How to Use Claude Code with Your Database: SQL Queries, Schema Inspection, and DBHub Setup


As developers, we often find ourselves juggling writing code, inspecting database schemas, and crafting SQL queries. What if you could have an AI assistant that understands your database structure and helps you write queries? That's exactly what DBHub brings to your development workflow.

In this guide, we'll walk through setting up DBHub—a universal database MCP server—in your local environment and connecting it to Claude Code to inspect schemas, generate queries, and safely test them before deploying to production.

What is DBHub?

DBHub is an open-source tool from Bytebase that implements the Model Context Protocol (MCP) server interface. It acts as a bridge between AI coding assistants like Claude Code and your databases, supporting PostgreSQL, MySQL, MariaDB, SQL Server, and SQLite.

Think of it as giving Claude Code direct, controlled access to your database schema—allowing it to:

  • Explore your database structure
  • Understand table relationships
  • Generate accurate SQL queries based on your actual schema
  • Execute read-only queries safely
  • Help you prototype queries locally before running them in production

Why This Matters for Your Development Workflow

Traditional database work involves constant context-switching: you write code in your editor, then jump to a database client to check table structures, write queries, test them, and iterate. With DBHub and Claude Code, you can:

  1. Stay in your flow: Ask Claude Code questions about your database without leaving your terminal
  2. Generate accurate queries: Claude Code sees your actual schema, not generic examples
  3. Test safely: Use read-only mode to explore production databases without risk
  4. Document as you go: Have Claude Code explain complex queries or schema relationships
  5. Learn by doing: Get real-time feedback on query optimization and best practices

Prerequisites

Before we begin, make sure you have:

  • Node.js installed (for running DBHub via npx)
  • Claude Code installed (get started with Claude Code)
  • A local database to connect to (we'll use PostgreSQL in examples, but any supported database works)
  • Docker (optional), if you prefer containerized deployment

Part 1: Setting Up DBHub Locally

Option 1: Quick Start with Demo Mode

The fastest way to try DBHub is using the built-in demo mode with a sample employee database:

npx @bytebase/dbhub --transport stdio --demo

This creates an in-memory SQLite database with sample data—perfect for testing the setup before connecting to your real database.

Option 2: Connect to Your Local Database

For PostgreSQL (adjust the connection string for your database):

npx @bytebase/dbhub \
  --transport stdio \
  --dsn "postgres://username:password@localhost:5432/your_database?sslmode=disable"

For MySQL:

npx @bytebase/dbhub \
  --transport stdio \
  --dsn "mysql://username:password@localhost:3306/your_database?sslmode=disable"

For SQL Server (Microsoft SQL Server / MSSQL):

npx @bytebase/dbhub \
  --transport stdio \
  --dsn "sqlserver://username:password@localhost:1433/your_database"

If your SQL Server instance uses a named instance rather than the default port, specify the instance name in the host and omit the port:

npx @bytebase/dbhub \
  --transport stdio \
  --dsn "sqlserver://username:password@localhost\\SQLEXPRESS/your_database"

Important: If your password contains special characters like @, :, /, #, or &, use environment variables instead:

export DB_TYPE=postgres
export DB_HOST=localhost
export DB_PORT=5432
export DB_USER=myuser
export DB_PASSWORD='my@complex:password/with#special&chars'
export DB_NAME=mydatabase

npx @bytebase/dbhub --transport stdio
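
If you'd rather keep everything in a single DSN, another option is to percent-encode the password first. Here's a quick shell sketch using the example credentials from above (it assumes python3 is available on your PATH):

```shell
# Percent-encode the password so special characters survive inside the DSN.
# safe='' makes quote() encode every reserved character, including '/'.
ENCODED=$(python3 -c "import urllib.parse; print(urllib.parse.quote('my@complex:password/with#special&chars', safe=''))")
echo "postgres://myuser:${ENCODED}@localhost:5432/mydatabase?sslmode=disable"
# prints postgres://myuser:my%40complex%3Apassword%2Fwith%23special%26chars@localhost:5432/mydatabase?sslmode=disable
```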

Option 3: Using Docker

If you prefer containerized deployment or are integrating DBHub into your development environment:

docker run --rm --init \
  --name dbhub \
  bytebase/dbhub \
  --transport stdio \
  --dsn "postgres://user:password@host.docker.internal:5432/dbname?sslmode=disable"

Note: When running in Docker and connecting to a database on your host machine, use host.docker.internal instead of localhost.

Adding DBHub to Your Docker Compose Stack

If you're already using Docker Compose for your development environment, you can add DBHub alongside your other services:

# docker-compose.yml
version: '3.8'

services:
  postgres:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: myapp
      POSTGRES_USER: developer
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  dbhub:
    image: bytebase/dbhub:latest
    container_name: dbhub
    ports:
      - "8080:8080"
    environment:
      - DBHUB_LOG_LEVEL=info
    command:
      - --transport
      - http
      - --port
      - "8080"
      - --dsn
      - "postgres://developer:devpass@postgres:5432/myapp"
    depends_on:
      - postgres

volumes:
  postgres_data:
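
One caveat with the stack above: depends_on only waits for the postgres container to start, not for the database inside it to accept connections, so DBHub can race ahead of Postgres on a cold start. A healthcheck closes that gap. This is a sketch to merge into the services above (pg_isready ships with the official postgres image):

```yaml
  postgres:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U developer -d myapp"]
      interval: 5s
      timeout: 3s
      retries: 10

  dbhub:
    depends_on:
      postgres:
        condition: service_healthy
```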

Part 2: Configuring Claude Code

Claude Code uses a configuration file to connect to MCP servers. DBHub is one of several MCP servers that can significantly enhance your development workflow. Here's how to set it up:

Step 1: Configure the MCP Server

Claude Code reads MCP server definitions from a .mcp.json file in your project root (or from your user-level configuration). Create or edit that file and add DBHub:

{
  "mcpServers": {
    "dbhub-local": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub",
        "--transport",
        "stdio",
        "--dsn",
        "postgres://username:password@localhost:5432/your_database?sslmode=disable"
      ]
    }
  }
}

Step 2: Enable Read-Only Mode

For added safety, especially when connecting to databases with real data, use read-only mode:

{
  "mcpServers": {
    "dbhub-local": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub",
        "--transport",
        "stdio",
        "--readonly",
        "--dsn",
        "postgres://username:password@localhost:5432/your_database?sslmode=disable"
      ]
    }
  }
}

In read-only mode, Claude Code can execute SELECT, SHOW, DESCRIBE, and other read operations, but cannot perform INSERT, UPDATE, DELETE, or CREATE operations. This is perfect for exploring production databases safely.

Step 3: Multiple Database Connections

You can configure multiple DBHub instances to work with different databases simultaneously:

{
  "mcpServers": {
    "dbhub-local": {
      "command": "npx",
      "args": ["-y", "@bytebase/dbhub"],
      "env": {
        "ID": "local",
        "DSN": "postgres://dev:devpass@localhost:5432/myapp_dev"
      }
    },
    "dbhub-staging": {
      "command": "npx",
      "args": ["-y", "@bytebase/dbhub"],
      "env": {
        "ID": "staging",
        "READONLY": "true",
        "DSN": "postgres://readonly:readpass@staging.example.com:5432/myapp_staging"
      }
    }
  }
}

The ID parameter ensures Claude Code can distinguish between databases. Tool names will be suffixed with the ID (e.g., execute_sql_local vs execute_sql_staging).

Part 3: Using Claude Code to Inspect Your Database

Once configured, restart Claude Code and you're ready to explore your database with AI assistance.

Exploring Schema Structure

Simply ask Claude Code about your database:

You: "What tables do we have in this database?"

Claude Code will query DBHub to list all tables in your schema.

You: "Show me the structure of the users table"

Claude Code will display the table structure, including columns, data types, constraints, and indexes.

Understanding Relationships

You: "What's the relationship between the users and orders tables?"

Claude Code can analyze foreign keys and explain how tables relate to each other.

Generating Queries

Here's where it gets powerful:

You: "Write a query to get all active users who made orders in the last 30 days"

Claude Code will:

  1. Inspect the relevant table structures
  2. Understand column names and data types
  3. Generate an accurate SQL query based on your actual schema

-- Claude Code generates:
SELECT DISTINCT u.*
FROM users u
INNER JOIN orders o ON u.id = o.user_id
WHERE u.status = 'active'
  AND o.created_at >= NOW() - INTERVAL '30 days'
ORDER BY u.created_at DESC;
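
If you want to sanity-check the shape of a generated query without touching a real Postgres instance, you can sketch it against an in-memory SQLite database. Note that SQLite has no INTERVAL syntax, so datetime('now', '-30 days') stands in for it; table and column names mirror the example above:

```shell
# Prototype the query's shape against an in-memory SQLite database.
python3 - <<'PY'
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, status TEXT,
                     created_at TEXT DEFAULT (datetime('now')));
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER,
                     created_at TEXT DEFAULT (datetime('now')));
INSERT INTO users  (id, status)  VALUES (1, 'active'), (2, 'inactive');
INSERT INTO orders (id, user_id) VALUES (10, 1), (11, 2);
""")
rows = con.execute("""
    SELECT DISTINCT u.id
    FROM users u
    JOIN orders o ON u.id = o.user_id
    WHERE u.status = 'active'
      AND o.created_at >= datetime('now', '-30 days')
""").fetchall()
print(rows)  # [(1,)]: only the active user with a recent order
PY
```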

Testing Queries Locally

You: "Execute this query and show me the first 5 results"

If not in read-only mode, Claude Code can execute queries directly and show you results—perfect for iterative development.

Part 4: Safe Production Database Access

One of DBHub's most powerful features is the ability to connect to production databases safely using read-only mode.

Setting Up a Read-Only Production Connection

{
  "mcpServers": {
    "dbhub-production": {
      "command": "npx",
      "args": [
        "-y",
        "@bytebase/dbhub",
        "--transport",
        "stdio",
        "--readonly",
        "--max-rows",
        "1000",
        "--dsn",
        "postgres://readonly_user:secure_password@production.example.com:5432/myapp_prod?sslmode=require"
      ]
    }
  }
}

Key safety features:

  • --readonly: Prevents any write operations
  • --max-rows 1000: Limits the number of rows returned to prevent accidentally fetching huge datasets
  • sslmode=require: Ensures encrypted connections
  • Use a database user with read-only permissions

Connecting Through SSH Tunnels

For databases behind firewalls or in private networks:

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@internal-database:5432/mydb" \
  --ssh-host bastion.example.com \
  --ssh-user ubuntu \
  --ssh-key ~/.ssh/production_key.pem \
  --readonly

Or, if you have SSH config already set up:

# ~/.ssh/config contains:
# Host prod-bastion
#   HostName bastion.example.com
#   User ubuntu
#   IdentityFile ~/.ssh/production_key.pem

npx @bytebase/dbhub \
  --dsn "postgres://dbuser:dbpass@internal-database:5432/mydb" \
  --ssh-host prod-bastion \
  --readonly

Part 5: Practical Workflows

Workflow 1: Local Query Development

  1. Connect DBHub to your local database with a copy of production data
  2. Ask Claude Code to help you write complex queries
  3. Test and iterate queries locally
  4. Copy the final query to your application code or migration files
  5. Deploy with confidence knowing the query works

Workflow 2: Production Debugging

  1. Connect DBHub to production in read-only mode
  2. Ask Claude Code to help diagnose data issues:
    • Show me users who have orphaned records in the orders table
    • Find duplicate email addresses in the users table
    • What's the distribution of order statuses in the last week?
  3. Generate fix queries locally based on findings
  4. Test fixes in your local or staging environment
  5. Apply to production through your normal deployment process

Workflow 3: Schema Documentation

  1. Connect to your database
  2. Ask Claude Code to document complex parts of your schema:
    • Explain how the payment processing tables work together
    • Document the user permissions system
    • What are all the possible states for an order?
  3. Save the explanations in your project documentation

Workflow 4: Query Optimization

You: "Here's a slow query I'm working on: [paste query].
How can I optimize it based on our current indexes?"

Claude Code will:

  1. Examine the query
  2. Check available indexes on the tables
  3. Suggest optimizations specific to your schema
  4. Explain the reasoning

Best Practices and Security Tips

1. Use Read-Only Mode for Production

Always use --readonly when connecting to production databases. Create a dedicated read-only database user:

-- PostgreSQL example
CREATE USER readonly_user WITH PASSWORD 'secure_password';
GRANT CONNECT ON DATABASE myapp_prod TO readonly_user;
GRANT USAGE ON SCHEMA public TO readonly_user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO readonly_user;
ALTER DEFAULT PRIVILEGES IN SCHEMA public
  GRANT SELECT ON TABLES TO readonly_user;

2. Limit Result Set Sizes

Use --max-rows to prevent accidentally querying millions of rows:

npx @bytebase/dbhub --max-rows 1000 --dsn "..."

3. Store Credentials Securely

Instead of hardcoding credentials in configuration:

# Use environment variables
export DB_DSN="postgres://user:password@localhost:5432/db"
npx @bytebase/dbhub --transport stdio

Or use a .env file (and add it to .gitignore):

# .env.local
DSN=postgres://user:password@localhost:5432/myapp
READONLY=true
MAX_ROWS=1000
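
DBHub reads these values from the environment, so one way to use the .env file from a shell is to export everything it defines before launching the server. This sketch assumes simple KEY=value lines with no quoting or spaces:

```shell
set -a              # auto-export every variable assigned from here on
. ./.env.local      # source the file; each line becomes an exported variable
set +a
npx @bytebase/dbhub --transport stdio
```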

4. Use Different Instances for Different Environments

Configure separate DBHub instances with appropriate permissions:

  • Local: Full access for development
  • Staging: Read-write for testing
  • Production: Read-only with limited rows

5. Leverage SSL Connections

For any non-local connection, require SSL:

# PostgreSQL with SSL
postgres://user:password@host:5432/db?sslmode=require

# MySQL with SSL
mysql://user:password@host:3306/db?tls=true

Troubleshooting Common Issues

DBHub Won't Connect

  • Verify database credentials and network connectivity
  • Check if your database allows connections from your IP
  • For Docker: ensure you're using host.docker.internal for host databases
  • For special characters in passwords: use environment variables instead of DSN

Claude Code Doesn't Show DBHub Tools

  • Verify the MCP configuration is valid JSON
  • Restart Claude Code after configuration changes
  • Check Claude Code logs for connection errors
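
A quick way to rule out the first bullet: run the config file through a JSON parser before restarting. The path below assumes a project-scoped .mcp.json; adjust it to wherever your configuration lives:

```shell
# Exits non-zero and prints the parse error if the file is not valid JSON.
python3 -m json.tool .mcp.json > /dev/null && echo "valid JSON" || echo "invalid JSON"
```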

Queries Are Timing Out

  • Add --max-rows to limit result sizes
  • Ensure your database has appropriate indexes
  • For large tables, use LIMIT clauses in your queries

SSH Tunnel Issues

  • Verify SSH key permissions: chmod 600 ~/.ssh/your_key.pem
  • Test SSH connection manually first: ssh -i ~/.ssh/your_key.pem user@host
  • Check firewall rules between bastion and database
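
The first two bullets can be checked in a few seconds. The key path and bastion host below are the example values from earlier in this guide; substitute your own:

```shell
KEY=~/.ssh/production_key.pem    # example path; point at your actual key
chmod 600 "$KEY"                 # SSH refuses private keys readable by others
stat -c '%a' "$KEY"              # should print 600 (on macOS: stat -f '%Lp')
ssh -i "$KEY" ubuntu@bastion.example.com 'echo tunnel ok'
```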

Advanced Configuration: Multiple Databases in Docker Compose

Here's a complete development environment with multiple databases:

version: '3.8'

services:
  postgres:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: myapp
      POSTGRES_USER: developer
      POSTGRES_PASSWORD: devpass
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  mysql:
    image: mysql:8.0
    environment:
      MYSQL_DATABASE: analytics
      MYSQL_USER: analyst
      MYSQL_PASSWORD: analyticspass
      MYSQL_ROOT_PASSWORD: rootpass
    ports:
      - "3306:3306"
    volumes:
      - mysql_data:/var/lib/mysql

  dbhub-postgres:
    image: bytebase/dbhub:latest
    command:
      - --transport
      - http
      - --port
      - "8080"
      - --dsn
      - "postgres://developer:devpass@postgres:5432/myapp"
    ports:
      - "8080:8080"
    depends_on:
      - postgres

  dbhub-mysql:
    image: bytebase/dbhub:latest
    command:
      - --transport
      - http
      - --port
      - "8081"
      - --dsn
      - "mysql://analyst:analyticspass@mysql:3306/analytics"
      - --readonly
    ports:
      - "8081:8081"
    depends_on:
      - mysql

volumes:
  postgres_data:
  mysql_data:

Then configure Claude Code to connect to both. Because these DBHub instances are already running over HTTP, point Claude Code at their URLs instead of launching new npx processes (the endpoint path may vary between DBHub versions, so check the DBHub README for yours):

{
  "mcpServers": {
    "dbhub-postgres": {
      "type": "http",
      "url": "http://localhost:8080/message"
    },
    "dbhub-mysql": {
      "type": "http",
      "url": "http://localhost:8081/message"
    }
  }
}

Real-World Example: Building a Feature

Let's walk through a complete example of building a new feature using this workflow.

Scenario: You need to add a recently active users dashboard to your application.

Step 1: Explore the Schema

You: "What tables contain user activity data?"

Claude Code inspects the schema and finds users, page_views, and user_sessions tables.

Step 2: Understand the Data

You: "Show me the structure of the user_sessions table
and explain how it relates to users"

Claude Code displays the schema and explains the relationship through foreign keys.

Step 3: Prototype the Query

You: "Write a query to get users who had at least one session
in the last 7 days, including their total session count and
last activity timestamp"

Claude Code generates:

SELECT
  u.id,
  u.username,
  u.email,
  COUNT(us.id) as session_count,
  MAX(us.last_activity_at) as last_active_at
FROM users u
INNER JOIN user_sessions us ON u.id = us.user_id
WHERE us.last_activity_at >= NOW() - INTERVAL '7 days'
GROUP BY u.id, u.username, u.email
ORDER BY last_active_at DESC
LIMIT 100;

Step 4: Test Locally

You: "Execute this query and show me the results"

Claude Code runs the query against your local database and shows sample results.

Step 5: Optimize

You: "Are there any indexes that would help this query perform better?"

Claude Code examines existing indexes and suggests:

CREATE INDEX idx_user_sessions_last_activity
ON user_sessions (last_activity_at);

Step 6: Validate Against Production Schema

Switch to your read-only production connection and verify the query works with the production schema (without actually running it on production):

You: "Does this query work with the production schema?"

Step 7: Deploy

Copy the final query to your application code, add the index migration, and deploy through your normal process with DeployHQ.

Integrating with Your DeployHQ Workflow

DBHub and Claude Code fit naturally into a deployment pipeline managed by DeployHQ. Once you've prototyped and tested your queries locally, here's how to bring them into production safely:

Pre-Deployment Validation

Before deploying, use Claude Code with DBHub to verify that your new queries and migrations are compatible with the target database schema. Connect to your staging database in read-only mode and ask Claude Code to validate the migration SQL against the live schema—catching column mismatches or missing indexes before they hit production.

Running Migrations After Deployment

DeployHQ supports SSH commands that run on your server after files are deployed. Use this to execute database migrations automatically as part of your deployment:

# Example SSH command in DeployHQ
cd /var/www/myapp && php artisan migrate --force

Or for a Node.js project using a migration tool like Knex:

cd /var/www/myapp && npx knex migrate:latest --env production

This ensures your schema changes and application code are deployed together in a single, coordinated step. For applications where downtime during migrations is a concern, see our guide on database migration strategies for zero-downtime deployments.

Post-Deployment Verification

After deployment, connect DBHub to your production database in read-only mode and ask Claude Code to verify the migration succeeded—check that new tables exist, columns have the right types, and indexes are in place. This gives you immediate confidence that everything applied correctly without needing to SSH into the server manually.

Version Control Integration

Keep your migration files and query changes in Git, and let DeployHQ handle the deployment from your repository. The workflow becomes: prototype with Claude Code and DBHub locally, commit the migration, push, and let DeployHQ deploy and run the migration on your server automatically.

Conclusion

DBHub bridges the gap between AI-assisted development and database work, making it easier to write accurate queries, understand complex schemas, and safely explore production data. By integrating it with Claude Code, you can:

  • Reduce context-switching during development
  • Generate more accurate SQL based on your actual schema
  • Safely explore production databases
  • Document database structures naturally through conversation
  • Learn database best practices as you work

The combination of local development with full access and production access with read-only safety gives you the flexibility to work efficiently while maintaining security and data integrity.

Next Steps

  1. Try the demo mode to get familiar with the workflow
  2. Connect to your local database and explore your schema
  3. Set up read-only production access for safe querying
  4. Integrate into your team's workflow by sharing the configuration

Ready to streamline your entire deployment pipeline—from database migrations to production releases? Sign up for DeployHQ and connect your Git repositories for automated, zero-downtime deployments.


DBHub is an open-source project from Bytebase. For more information, visit the DBHub GitHub repository.

Questions about integrating database workflows into your deployment process? Reach out at support@deployhq.com or find us on X (@deployhq).