Docker Quick Start
Introduction
This guide gets the complete NeuronDB ecosystem running in under 5 minutes using Docker Compose. The ecosystem includes:
- NeuronDB - PostgreSQL extension with vector search, ML inference, and GPU acceleration
- NeuronAgent - REST API and WebSocket agent runtime with long-term memory
- NeuronMCP - Model Context Protocol server with 100+ tools for MCP-compatible clients
- NeuronDesktop - Unified web interface for managing all components
Why Docker? Docker provides the easiest and most consistent setup, with automatic networking, configuration, and GPU support across platforms.
Prerequisites
Before starting, verify you have:
- Docker 20.10+ and Docker Compose 2.0+ installed
- 4GB+ RAM available (8GB recommended)
- Ports available: 5433 (PostgreSQL), 8080 (NeuronAgent), 8081 (NeuronDesktop API), 3000 (NeuronDesktop UI); a quick availability check is sketched after this list
- Optional: NVIDIA Docker runtime (CUDA), ROCm drivers (AMD), or Metal support (macOS/Apple Silicon)
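To avoid conflicts on first start, you can confirm the default ports are free before continuing. A minimal sketch, assuming a Unix-like shell with `lsof` installed (adjust the port list if you change the defaults):
```bash
# Check that the default host ports are not already bound by another process
for port in 5433 8080 8081 3000; do
  if lsof -iTCP:"$port" -sTCP:LISTEN >/dev/null 2>&1; then
    echo "Port $port is already in use"
  else
    echo "Port $port is free"
  fi
done
```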
Verify Docker installation
```bash
docker --version
docker compose version
```
Quick Start (5 minutes)
Start the complete NeuronDB ecosystem with a single command:
Step 1: Clone the Repository
Clone repository
```bash
git clone https://github.com/neurondb-ai/neurondb.git
cd neurondb
```
Step 2: Start All Services
Start ecosystem (CPU profile)
```bash
# Start all services with CPU profile (default)
docker compose up -d
```
This command will:
- Build all Docker images (first time only, takes 5-10 minutes); you can follow progress in the logs, as shown below
- Start PostgreSQL with NeuronDB extension
- Start NeuronAgent (REST API server on port 8080)
- Start NeuronMCP (MCP protocol server)
- Start NeuronDesktop (web UI on port 3000, API on 8081)
- Configure networking between all components
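On the first start it can be useful to watch the build and initialization as it happens. A minimal sketch using the standard Compose log command:
```bash
# Follow recent logs from every service while the stack builds and initializes
docker compose logs -f --tail=50
```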
Step 3: Check Service Status
Verify all services are running
```bash
docker compose ps
```
You should see five services running with "healthy" status:
- neurondb-cpu - PostgreSQL with NeuronDB extension
- neuronagent - REST API server
- neurondb-mcp - MCP protocol server
- neurondesk-api - NeuronDesktop API server
- neurondesk-frontend - NeuronDesktop web interface
Wait 30-60 seconds for all services to initialize and show "healthy" status.
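If you script the setup, you can poll the documented health endpoints instead of waiting a fixed time. A minimal sketch, assuming `curl` is available and the default ports are used:
```bash
# Wait until NeuronAgent (8080) and the NeuronDesktop API (8081) answer their health checks
for url in http://localhost:8080/health http://localhost:8081/health; do
  until curl -sf "$url" >/dev/null; do
    echo "Waiting for $url ..."
    sleep 5
  done
  echo "$url is up"
done
```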
Verify Services
Run these quick verification commands to confirm everything is working:
Test 1: NeuronDB Extension
Verify NeuronDB extension
```bash
docker compose exec neurondb psql -U neurondb -d neurondb -c "SELECT neurondb.version();"
```
Expected output: 1.0.0
Test 2: NeuronAgent REST API
Check NeuronAgent health
```bash
curl http://localhost:8080/health
```
Expected output: {"status":"ok"}
Test 3: NeuronDesktop API
Check NeuronDesktop API
```bash
curl http://localhost:8081/health
```
Expected output: JSON response with status information
Test 4: First Vector Query
Create extension and test vector search
```bash
# Connect to the database and run the SQL below through a heredoc
docker compose exec -T neurondb psql -U neurondb -d neurondb <<EOF
-- Create extension
CREATE EXTENSION IF NOT EXISTS neurondb;
-- Create a test table (3-dimensional embeddings to match the sample vector below)
CREATE TABLE IF NOT EXISTS documents (
    id SERIAL PRIMARY KEY,
    content TEXT,
    embedding vector(3)
);
-- Insert sample document
INSERT INTO documents (content, embedding)
VALUES ('Hello, NeuronDB!', '[0.1, 0.2, 0.3]'::vector)
ON CONFLICT DO NOTHING;
-- Verify data
SELECT id, content FROM documents;
EOF
```
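With data in place you can try a nearest-neighbor query. The sketch below assumes NeuronDB accepts pgvector-style distance operators such as `<->` (Euclidean distance); see the Vector Indexing documentation for the exact operators and index options.
```bash
# Order documents by distance to a query vector and return the closest matches
docker compose exec -T neurondb psql -U neurondb -d neurondb <<EOF
SELECT id, content, embedding <-> '[0.1, 0.2, 0.25]'::vector AS distance
FROM documents
ORDER BY distance
LIMIT 5;
EOF
```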
GPU Profiles
The Docker Compose setup supports multiple GPU profiles for accelerated operations. Choose the profile that matches your hardware:
CPU Profile (Default)
CPU-only setup
```bash
docker compose up -d
```
Uses port 5433 for PostgreSQL.
CUDA Profile (NVIDIA GPU)
CUDA GPU acceleration
```bash
docker compose --profile cuda up -d
```
Requires NVIDIA Docker runtime. Uses port 5434 for PostgreSQL. See CUDA GPU Support for setup details.
ROCm Profile (AMD GPU)
ROCm GPU acceleration
```bash
docker compose --profile rocm up -d
```
Requires ROCm drivers. Uses port 5435 for PostgreSQL. See ROCm GPU Support for setup details.
Metal Profile (Apple Silicon)
Metal GPU acceleration (macOS)
```bash
docker compose --profile metal up -d
```
For macOS with Apple Silicon (M1/M2/M3). Uses port 5436 for PostgreSQL. See Metal GPU Support for setup details.
Note: You can run multiple profiles simultaneously on different ports. For example, run both CPU and CUDA profiles side-by-side for testing.
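A minimal sketch of that side-by-side setup (the CUDA instance then listens on port 5434, as noted above):
```bash
# Start the default CPU stack and the CUDA profile alongside it
docker compose up -d
docker compose --profile cuda up -d

# Both stacks should now appear in the service list
docker compose --profile cuda ps
```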
Service URLs & Access
After starting services, access them at:
| Service | URL / Connection | Description |
|---|---|---|
| NeuronDB | postgresql://neurondb:neurondb@localhost:5433/neurondb | PostgreSQL with NeuronDB extension |
| NeuronAgent | http://localhost:8080 | REST API and WebSocket endpoints |
| NeuronMCP | stdio (JSON-RPC 2.0) | MCP protocol server for MCP clients |
| NeuronDesktop UI | http://localhost:3000 | Web-based management interface |
| NeuronDesktop API | http://localhost:8081 | Backend API for NeuronDesktop |
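As a quick check from the host, the NeuronDB connection string from the table can be used with a local `psql` client (if you don't have one installed, fall back to `docker compose exec neurondb psql ...` as in the tests above):
```bash
# Connect from the host using the documented connection string
psql postgresql://neurondb:neurondb@localhost:5433/neurondb -c "SELECT neurondb.version();"
```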
Common Commands
Service management
```bash
# Stop all services (keep data)
docker compose down
# Stop and remove all data volumes
docker compose down -v
# View logs from all services
docker compose logs -f
# View logs from a specific service
docker compose logs -f neurondb
docker compose logs -f neuronagent
# Restart a specific service
docker compose restart neurondb
```
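When you pull a newer version of the repository, rebuilding the images is the usual follow-up. A minimal sketch, assuming you cloned the repo with git as in Step 1:
```bash
# Update sources and rebuild the service images in place
git pull
docker compose up -d --build
```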
Next Steps
Now that your ecosystem is running, explore these resources:
- Quick Start Guide - Create your first vector table, generate embeddings, and run semantic search queries
- NeuronAgent Documentation - Build AI agents with REST API, WebSocket, and long-term memory
- NeuronMCP Documentation - Use 100+ MCP tools with Claude Desktop and other MCP clients
- NeuronDesktop Documentation - Manage your ecosystem through the unified web interface
- Vector Indexing - Configure HNSW, IVF, and quantization for production-scale search
- RAG Pipelines - Build retrieval augmented generation workflows in PostgreSQL
Troubleshooting
Having issues? Check these common problems:
Services Won't Start
Check logs
```bash
docker compose logs neurondb
docker compose logs neuronagent
docker compose logs neurondb-mcp
```
Port Already in Use
If ports 5433, 8080, 8081, or 3000 are in use, modify docker-compose.yml or stop conflicting services.
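To find out what is holding a port, a quick check on a Unix-like host (using `lsof`; on Linux, `ss -ltnp` works as well):
```bash
# Show the process currently listening on a conflicting port, e.g. 5433
lsof -iTCP:5433 -sTCP:LISTEN
```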
Out of Memory
Ensure Docker has at least 4GB RAM allocated (8GB+ recommended). Check Docker Desktop → Settings → Resources.
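To see how much memory the Docker engine can actually use, one quick check (the value is reported in bytes):
```bash
# Print the total memory available to the Docker engine, in bytes
docker info --format '{{.MemTotal}}'
```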
GPU Not Detected
For CUDA: Verify NVIDIA Docker runtime is installed. For ROCm: Check that ROCm drivers are available. See GPU Documentation for detailed setup instructions.
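A common way to confirm that containers can see an NVIDIA GPU is to run `nvidia-smi` inside a CUDA base image; the image tag below is only an example and is not part of this repository:
```bash
# Should print the GPU table if the NVIDIA container runtime is working
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```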
For more help, see the Troubleshooting Guide or check service logs with docker compose logs.