Show HN: Cloud-Ready Postgres MCP Server

2025-03-30 3:14 · github.com


A Model Context Protocol (MCP) server for PostgreSQL databases with enhanced capabilities for AI agents.

PG-MCP is a server implementation of the Model Context Protocol for PostgreSQL databases. It provides a comprehensive API for AI agents to discover, connect to, query, and understand PostgreSQL databases through MCP's resource-oriented architecture.

This implementation builds upon and extends the reference Postgres MCP implementation with several key enhancements:

  1. Full Server Implementation: Built as a complete server with SSE transport for production use
  2. Multi-database Support: Connect to multiple PostgreSQL databases simultaneously
  3. Rich Catalog Information: Extracts and exposes table/column descriptions from the database catalog
  4. Extension Context: Provides detailed YAML-based knowledge about PostgreSQL extensions like PostGIS and pgvector
  5. Query Explanation: Includes a dedicated tool for analyzing query execution plans
  6. Robust Connection Management: Proper lifecycle for database connections with secure connection ID handling

  • Connect Tool: Register PostgreSQL connection strings and get a secure connection ID
  • Disconnect Tool: Explicitly close database connections when done
  • Connection Pooling: Efficient connection management with pooling
  • pg_query: Execute read-only SQL queries using a connection ID
  • pg_explain: Analyze query execution plans in JSON format
  • List schemas with descriptions
  • List tables with descriptions and row counts
  • Get column details with data types and descriptions
  • View table constraints and indexes
  • Explore database extensions
  • Sample table data (with pagination)
  • Get approximate row counts
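
The connection tools above hand agents opaque IDs instead of raw connection strings. A minimal sketch of what such secure connection-ID handling could look like (the registry and function names here are illustrative, not pg-mcp's actual internals):

```python
import uuid

# Hypothetical in-memory registry mapping opaque IDs to connection strings.
_connections: dict[str, str] = {}

def register_connection(dsn: str) -> str:
    """Store the DSN server-side and hand the client only an opaque ID."""
    conn_id = str(uuid.uuid4())
    _connections[conn_id] = dsn
    return conn_id

def lookup_dsn(conn_id: str) -> str:
    """Resolve an opaque ID back to its DSN; raises KeyError if unknown."""
    return _connections[conn_id]

def disconnect(conn_id: str) -> None:
    """Drop the registration so the ID can no longer be used."""
    _connections.pop(conn_id, None)
```

The point of the pattern: credentials cross the wire once, at connect time; every later tool call and resource URL carries only the ID.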

Built-in contextual information for PostgreSQL extensions like:

  • PostGIS: Spatial data types, functions, and examples
  • pgvector: Vector similarity search functions and best practices

Additional extensions can be easily added via YAML config files.
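
The exact schema of these files isn't shown here; a hypothetical pgvector-style context file might look like this (the field names are assumptions, not the project's actual layout):

```yaml
# extensions/pgvector.yaml (illustrative layout)
extension: pgvector
description: Vector similarity search for PostgreSQL
types:
  - name: vector
    description: Fixed-length float vector, e.g. vector(1536)
functions:
  - name: cosine_distance
    operator: "<=>"
    description: Cosine distance between two vectors
examples:
  - description: Find the 5 nearest neighbours to a query embedding
    sql: |
      SELECT id FROM items
      ORDER BY embedding <=> $1
      LIMIT 5;
```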

  • Python 3.13+
  • PostgreSQL database(s)
# Clone the repository
git clone https://github.com/stuzero/pg-mcp.git
cd pg-mcp

# Build and run with Docker Compose
docker-compose up -d

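
The repository's compose file isn't reproduced here; a minimal illustrative sketch (service name and layout are assumptions) that binds the SSE port to localhost only, since the server itself does not authenticate clients:

```yaml
# docker-compose.yml (illustrative)
services:
  pg-mcp:
    build: .
    ports:
      # "127.0.0.1:8000:8000" keeps the port on the loopback interface;
      # a bare "8000:8000" publishes it on all interfaces.
      - "127.0.0.1:8000:8000"
```
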
# Clone the repository
git clone https://github.com/stuzero/pg-mcp.git
cd pg-mcp

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install using uv
uv sync --frozen

# Run the server
python -m server.app

The repository includes test scripts to verify server functionality:

# Basic server functionality test
python test.py "postgresql://username:password@hostname:port/database"

# Claude-powered natural language to SQL conversion
python client/claude_cli.py "Show me the top 5 customers by total sales"

The claude_cli.py script requires environment variables:

# .env file
DATABASE_URL=postgresql://username:password@hostname:port/database
ANTHROPIC_API_KEY=your-anthropic-api-key
PG_MCP_URL=http://localhost:8000/sse

Example prompt for use with agents:

Use the PostgreSQL MCP server to analyze the database. 
Available tools:
- connect: Register a database connection string and get a connection ID
- disconnect: Close a database connection
- pg_query: Execute SQL queries using a connection ID
- pg_explain: Get query execution plans

You can explore schema resources via:
pgmcp://{conn_id}/schemas
pgmcp://{conn_id}/schemas/{schema}/tables
pgmcp://{conn_id}/schemas/{schema}/tables/{table}/columns
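
These pgmcp:// URIs follow ordinary URL syntax, so an agent can build or pick them apart with the standard library; a small sketch (helper names are illustrative):

```python
from urllib.parse import urlparse

def schema_tables_uri(conn_id: str, schema: str) -> str:
    """Build the resource URI for a schema's table list."""
    return f"pgmcp://{conn_id}/schemas/{schema}/tables"

def parse_resource_uri(uri: str) -> tuple[str, list[str]]:
    """Split a pgmcp:// URI into (connection ID, path segments)."""
    parts = urlparse(uri)
    if parts.scheme != "pgmcp":
        raise ValueError(f"not a pgmcp URI: {uri}")
    # The "authority" slot carries the opaque connection ID, never credentials.
    return parts.netloc, [p for p in parts.path.split("/") if p]
```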

This server is built on:

  • MCP: The Model Context Protocol foundation
  • FastMCP: Python library for MCP
  • asyncpg: Asynchronous PostgreSQL client
  • YAML: For extension context information

  • The server runs in read-only mode by default (enforced via transaction settings)
  • Connection details are never exposed in resource URLs, only opaque connection IDs
  • Database credentials only need to be sent once during the initial connection
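
The note above says read-only mode is enforced via transaction settings rather than by filtering SQL text. With asyncpg, that pattern looks roughly like this (a sketch, not pg-mcp's actual code):

```python
import asyncio

async def run_readonly(conn, sql: str, *args):
    """Execute a query inside a read-only transaction.

    `conn` is an asyncpg-style connection: asyncpg's
    Connection.transaction() accepts readonly=True, and PostgreSQL
    itself then rejects any write attempted inside the block.
    """
    async with conn.transaction(readonly=True):
        return await conn.fetch(sql, *args)
```

Because enforcement happens inside PostgreSQL, no keyword blacklist is needed; even a disguised write fails with a "cannot execute ... in a read-only transaction" error.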

Contributions are welcome! Areas for expansion:

  • Additional PostgreSQL extension context files
  • More schema introspection resources
  • Query optimization suggestions


Comments

  • By saberience 2025-03-30 10:57 · 3 replies

    Just for everyone here, the code for "building an MCP server" is importing the standard MCP package for TypeScript, Python, etc., then writing as little as 10 lines of code to define something as an MCP tool.

    Basically, it's not rocket science. I also built MCP servers for MySQL, Twilio, Polars, etc.

    • By spennant 2025-03-30 14:31 · 1 reply

      I built this to fill a specific need. It's not really made to talk to Claude Desktop (although it can). It's built to take multiple connections from agents who bring their own LLM (via API key or whatever) and provide context to the LLM model. Yes, it does import the standard Python MCP package, however it does quite a few non-standard things in order to achieve the multi-tenancy. I encourage you to look at the code.

    • By runako 2025-03-30 12:28 · 2 replies

      From HN guidelines:

      > Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something.

      We are hackers here. Building is good. Sharing is good. All this is true even if you personally know how to do what is being shared, and it is easy for you. I promise you there are people who encounter every sharing post here and do not think what is posted is easy.

      • By brulard 2025-03-30 13:43 · 1 reply

        I think we need to hear exactly this kind of thing. This is what I was wondering: why is every MCP project such big news? Isn't it just a few lines of code?

      • By asdev 2025-03-30 16:15

        5% hackers, 95% FANG employees who think this is meaningless

    • By esafak 2025-03-30 15:24

      You do have to implement every piece of functionality that you want to expose.

  • By 1zael 2025-03-30 5:32 · 8 replies

    This is wild. Our company has like 10 data scientists writing SQL queries on our DB for business questions. I can deploy pg-mcp for my organization so everyone can use Claude to answer whatever is on their mind? (e.g. "show me the top 5 customers by total sales")

    sidenote: I'm scared of what's going to happen to those roles!

    • By clusterfook 2025-03-30 8:29 · 2 replies

      Yep gonna be easy

      Q: show me the top 5 customers by total sales

      A: System.Data.Odbc.OdbcException (0x80131937): ERROR [57014] ERROR: canceling statement due to statement timeout;

      Q: Why do I get this error

      A: Looks like it needs an index, let me create that for you. Done. Rerunning query.

      could not close temporary statistics file "pg_stat_tmp/global.tmp": No space left on device

      Q: Why this error

      A: 429 Too Many Requests

      Rubs hands... great next 10 years to be a backend dev.

      • By curious_cat_163 2025-03-30 13:41

        That’s a good example of a worst case scenario. This is why we would still need humans loitering about.

        The question is, do they still need 10? Or would 2 suffice? How about 5?

        This does not need to be a debate about absolutes.

      • By fullstackchris 2025-03-30 9:13

        and the next 10 after that, and the next 10 after that, and...

    • By spennant 2025-03-30 12:10 · 2 replies

      It won’t be that easy. First off, most databases in the wild are not well documented. LLMs benefit from context, and if your tables/columns have non-intuitive or non-descriptive names, the SQL may not even work. Second, you might benefit from an LLM fine-tuned on writing code and/or an intelligent Agent that checks for relevancy and ambiguity in user input prior to attempting to answer the question. It would also help if the agent executed the query to see how it answered the user’s question. In other words “reasoning”… pg-mcp simply exposes the required context for Agents to do that kind of reasoning.

      • By nickdichev 2025-03-30 12:54

        The COMMENT command will finally be useful :)

      • By dinfinity 2025-03-30 15:59

        Then let the AI first complete the documentation by looking at the existing documentation, querying the DB (with pg-mcp), etc.

        Do human reviewing and correcting of the updated documentation. Then ensure that the AI knows that the documentation might still contain errors and ask it to do the 'actual' work.

    • By moltar 2025-03-30 7:54 · 1 reply

      There are LLM SQL benchmarks. [1] And the state-of-the-art solution is still only at 77% accuracy. Would you trust that?

      [1] https://bird-bench.github.io/

    • By otabdeveloper4 2025-03-30 6:57 · 2 replies

      Probably nothing. "Expose the database to the pointy-haired boss directly, as a service" is an idea as old as computing itself. Even SQL itself was originally an iteration of that idea. Every BI system (including PowerBI and Tableau) was supposed to be that.

      It doesn't work because the PHB doesn't have the domain knowledge and doesn't know which questions to ask. (No, it's never as simple as group-by and top-5.)

      • By jaccola 2025-03-30 7:24

        I would say SQL still is that! My wife had to learn some SQL to pull reports in some non-tech finance job 10 years ago. (I think she still believes this is what I do all day…)

        I suppose this could be useful in that it prevents everyone in the company having to learn even the basics of SQL which is some barrier, however minimal.

        Also the LLM will presumably be able to see all the tables/fields and ‘understand’ them (with the big assumption that they are even remotely reasonably named) so English language queries will be much more feasible now. Basically what LLMs have over all those older attempts is REALLY good fuzziness.

        I see this being useful for some subset of questions.

      • By pclmulqdq 2025-03-30 12:50 · 1 reply

        A family friend maintains a SQL database of her knitting projects that she does as a hobby. The PHB can easily learn SQL if they want.

    • By a-dub 2025-03-30 7:43 · 1 reply

      is that true? i'd like that, but i get the sense that this mcp stuff is more oriented around programming assistant and agent applications.

      i suppose the desktop app can use it, but how good is it for this general-purpose "chat with the database for lightweight analytics" use case? is it worth the trouble of dealing with some electron app to make it work?

      • By sshine 2025-03-30 12:50

        > i get the sense that this mcp stuff is more oriented around programming assistant and agent applications

        Agents will become ubiquitous parts of the user interface that is currently the chat.

        So if you bother with a website or an electron app now, MCP will just add more capabilities to what you can control using agents.

    • By Kiro 2025-03-30 7:13 · 1 reply

      There are hundreds of text-to-SQL companies and integrations already. What's different about this that makes you react like that?

      • By romanovcode 2025-03-30 7:47 · 2 replies

        Those companies will be dead once this goes mainstream. Why pay a 3rd-party company when you can ask an LLM to create graphs and analyses of whatever you want? Pair it with scheduled tasks and I really don't see any value in those SaaS products.

    • By risyachka 2025-03-30 10:40

      So you will ask "What is our churn?", get a random result, and then turn your whole marketing strategy around a wrong number?

      That's cute.

    • By slt2021 2025-03-30 8:52

      didn't Tableau (and some other BI solutions) have this feature out of the box?

  • By fulafel 2025-03-30 7:53 · 1 reply

    From docker-compose

        ports:
          - "8000:8000"
    
    This will cause Docker to expose this to the internet and even helpfully configure an allow rule to the host firewall, at least on Linux.

    • By rubslopes 2025-03-30 10:32 · 2 replies

      Good catch.

      OP, exposing your application without authentication is a serious security risk!

      Quick anecdote: Last week, I ran a Redis container on a VPS with an exposed port and no password (rookie mistake). Within 24 hours, the logs revealed someone attempting to make my Redis instance a slave to theirs! The IP traced back to Tencent, the Chinese tech giant... Really weird. Fortunately, there was nothing valuable stored in it.

      • By acheong08 2025-03-30 11:55

        > The IP traced back to Tencent, the Chinese tech giant... Really weird.

        They're a large cloud provider in Asia like Amazon AWS or Microsoft Azure. I doubt such a tech company would make it that obvious when breaking the law.

      • By spennant 2025-03-30 12:27

        I made a few assumptions about the actual deployer and their environment that I shouldn’t have… I’ll need to address this. Thanks!
