Connecting Your LLMs to Snowflake: A Guide to Installing the Snowflake MCP Server
Introduction: Empowering Models with Enterprise Data via MCP
The landscape of artificial intelligence, particularly Large Language Models (LLMs), is constantly pushing boundaries. However, a common limitation is their inability to access timely, specific, or proprietary data locked away in enterprise data warehouses. While pre-trained models hold vast general knowledge, they often lack the context required for tasks demanding insights from specific datasets, such as internal sales figures, customer support logs, or real-time inventory levels.
The Model Context Protocol (MCP) emerges as a crucial standard to address this gap. MCP defines a structured way for AI models (or the applications orchestrating them) to request and receive context from diverse data sources. It acts as a universal translator, enabling a model to ask, "What were the top 10 products sold in the EMEA region last quarter?" and receive a structured answer directly from a connected data source, rather than relying solely on its generalized training data.
Snowflake has become a dominant force in cloud data warehousing, offering a powerful platform for storing, processing, and analyzing vast amounts of structured and semi-structured data. The Snowflake MCP Server, specifically the implementation by Isaac Wasserman, serves as a dedicated bridge between the MCP standard and your Snowflake data warehouse. It listens for MCP requests, translates them into appropriate Snowflake SQL queries, executes those queries against your Snowflake instance, and returns the results formatted as MCP context responses.
By deploying this server, you unlock significant capabilities:
- Direct Snowflake Access: Enable models to query data directly from specified Snowflake tables and views in real time.
- Contextual Prompt Engineering: Dynamically inject relevant data points from Snowflake into model prompts for more accurate and relevant responses.
- Data-Driven Insights: Allow models to perform analysis, summarization, or answer questions grounded in the specific data within your Snowflake warehouse.
- Standardized Data Interaction: Leverage the MCP standard for consistent communication between models and various context providers, including Snowflake.
This tutorial provides a detailed walkthrough for installing and configuring the Snowflake MCP Server. We will cover prerequisites related to Snowflake and your local environment, setting up Snowflake resources, cloning the code, configuring the server, and finally, running and testing the installation. This guide assumes a basic understanding of command-line interfaces, Snowflake concepts (users, roles, warehouses), and whatever runtime environment the project uses (check the repository's README for the implementation language before installing anything).
The source code and primary documentation for the server can be found at:
https://github.com/isaacwasserman/mcp-snowflake-server
Let's begin integrating your Snowflake data with your AI models!
Prerequisites
Before installing the Snowflake MCP Server, ensure the following are in place:
- Snowflake Account: You need access to a Snowflake account, along with your Snowflake Account Identifier. (Find this in your Snowflake URL: `your-account-identifier.snowflakecomputing.com`.)
- Snowflake User Credentials: A dedicated Snowflake user account is highly recommended for the MCP server to connect with. This enhances security and auditability. You will need:
  - Username: The login name for the Snowflake user (e.g., `MCP_USER`).
  - Authentication Method: Choose one:
    - Password: The password associated with the Snowflake user.
    - Key-Pair Authentication (Recommended): A more secure method involving generating a private/public key pair. You'll need the path to the private key file and, if the key itself is encrypted, its passphrase.
- Snowflake Resources & Permissions: The Snowflake user must have appropriate permissions:
  - Role: A designated Snowflake role (e.g., `MCP_ROLE`) assigned to the user.
  - Warehouse: Access to a virtual warehouse for executing queries (e.g., `MCP_WAREHOUSE`). The user/role needs the `USAGE` privilege on the warehouse.
  - Database & Schema: Access to the specific database(s) and schema(s) containing the data the MCP server needs to query. The user/role needs the `USAGE` privilege on both the database and the schema.
  - Tables/Views: `SELECT` privileges on the specific tables or views within the allowed schemas that the server will query. Adhere to the principle of least privilege: only grant access to necessary data.
- Runtime Environment: The server implementation requires a specific runtime. Check the project's README (github.com/isaacwasserman/mcp-snowflake-server) for the exact language:
  - If Go: Install Go (version 1.18 or later is often recommended). Download from golang.org. Verify with `go version`.
  - If Python: Install Python (usually Python 3.x). Check the project requirements for specific versions/packages (`requirements.txt`).
  - If Node.js: Install Node.js (LTS version recommended). Check the project requirements (`package.json`).
- Git: Required to clone the repository. Install from git-scm.com/downloads if needed. Verify with `git --version`.
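The runtime and Git checks above are easy to script. A small sketch using only the Python standard library (the tool names in the loop are examples; substitute whatever the README actually requires):

```python
import shutil
import subprocess

def tool_available(name):
    """Return True if an executable named `name` is on PATH."""
    return shutil.which(name) is not None

def tool_version(name, flag="--version"):
    """Return the tool's version string, or None if it is missing."""
    if not tool_available(name):
        return None
    out = subprocess.run([name, flag], capture_output=True, text=True)
    return (out.stdout or out.stderr).strip()

# Example: check Git plus whichever runtime the project's README calls for.
for tool in ("git", "python3"):
    print(tool, "->", tool_version(tool) or "NOT FOUND")
```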
Step 1: Setting Up Snowflake Resources
This step involves creating the dedicated user, role, and granting permissions within Snowflake. Using Snowsight (the Snowflake web UI) or SnowSQL (the command-line client) is recommended.
Connect to Snowflake: Log in with a user having `ACCOUNTADMIN` or `SECURITYADMIN` privileges (or sufficient privileges to create users, roles, and grant permissions).

Create a Role (Recommended):

```sql
USE ROLE SECURITYADMIN; -- Or ACCOUNTADMIN
CREATE ROLE IF NOT EXISTS MCP_ROLE
  COMMENT = 'Role for MCP Server access';
GRANT ROLE MCP_ROLE TO ROLE SYSADMIN; -- Or grant to ACCOUNTADMIN; allows role management
```
Create a Warehouse (If Needed): If you don't have a suitable warehouse:

```sql
USE ROLE SYSADMIN; -- Or ACCOUNTADMIN
CREATE WAREHOUSE IF NOT EXISTS MCP_WAREHOUSE
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60 -- Suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE
  INITIALLY_SUSPENDED = TRUE
  COMMENT = 'Warehouse for MCP Server';
GRANT USAGE ON WAREHOUSE MCP_WAREHOUSE TO ROLE MCP_ROLE;
```
Create a User:

```sql
USE ROLE SECURITYADMIN; -- Or ACCOUNTADMIN
CREATE USER IF NOT EXISTS MCP_USER
  PASSWORD = 'your-strong-password' -- Replace with a very strong password if using password auth
  LOGIN_NAME = 'MCP_USER'
  DISPLAY_NAME = 'MCP Server User'
  DEFAULT_WAREHOUSE = 'MCP_WAREHOUSE'
  DEFAULT_ROLE = 'MCP_ROLE'
  MUST_CHANGE_PASSWORD = FALSE -- TRUE forces a change on first login, but complicates server setup
  COMMENT = 'User account for the MCP Server';
GRANT ROLE MCP_ROLE TO USER MCP_USER;
```

Security Note: Using password authentication directly in configurations is less secure. Key-pair authentication is strongly preferred.
Configure Key-Pair Authentication (Recommended):

- Generate Keys: Use OpenSSL (install if needed) to generate an unencrypted private key in PKCS#8 format (the format Snowflake's documentation uses) and the corresponding public key:

```bash
# Generate an unencrypted 2048-bit RSA private key in PKCS#8 format
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -nocrypt -out ~/.ssh/snowflake_mcp_key.p8
# Generate the corresponding public key
openssl rsa -in ~/.ssh/snowflake_mcp_key.p8 -pubout -out ~/.ssh/snowflake_mcp_key.pub
```

Note: Protect the private key file (`snowflake_mcp_key.p8`) and store it securely. You can encrypt the key at generation time (replace `-nocrypt` with a passphrase option such as `-v2 aes-256-cbc`) and supply the passphrase during configuration, but first confirm the server supports passphrase-protected keys. The example above creates an unencrypted key for simplicity in server configuration.
- Extract Public Key Content: Copy the content of the public key file (`snowflake_mcp_key.pub`), excluding the `-----BEGIN PUBLIC KEY-----` and `-----END PUBLIC KEY-----` lines. It should be a single block of text.
- Assign Public Key to User: Execute this SQL in Snowflake:

```sql
USE ROLE SECURITYADMIN; -- Or ACCOUNTADMIN
ALTER USER MCP_USER SET RSA_PUBLIC_KEY='PASTE_PUBLIC_KEY_CONTENT_HERE';
-- Example: ALTER USER MCP_USER SET RSA_PUBLIC_KEY='MIIBIjANBgkqh...';
```

- Note Down: Keep track of the full path to the private key file (`~/.ssh/snowflake_mcp_key.p8` in the example).
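Stripping the `BEGIN`/`END` lines by hand is error-prone; a small stdlib-only helper can produce the value for the `ALTER USER ... SET RSA_PUBLIC_KEY` statement directly:

```python
def rsa_public_key_body(pem_text):
    """Return the base64 body of a PEM public key, with the
    -----BEGIN/END PUBLIC KEY----- lines and newlines removed."""
    lines = [line.strip() for line in pem_text.strip().splitlines()]
    return "".join(line for line in lines if line and not line.startswith("-----"))

# Usage:
# with open("snowflake_mcp_key.pub") as f:
#     print(rsa_public_key_body(f.read()))
```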
Grant Data Access Permissions: Grant `MCP_ROLE` the necessary privileges on the data it needs to query:

```sql
USE ROLE SECURITYADMIN; -- Or ACCOUNTADMIN
-- Grant usage on the specific database(s)
GRANT USAGE ON DATABASE YOUR_DATABASE TO ROLE MCP_ROLE;
-- Grant usage on the specific schema(s) within that database
GRANT USAGE ON SCHEMA YOUR_DATABASE.YOUR_SCHEMA TO ROLE MCP_ROLE;
-- Grant SELECT on specific table(s)
GRANT SELECT ON TABLE YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE TO ROLE MCP_ROLE;
-- Or grant on all tables in a schema (use with caution)
-- GRANT SELECT ON ALL TABLES IN SCHEMA YOUR_DATABASE.YOUR_SCHEMA TO ROLE MCP_ROLE;
-- Grant SELECT on specific view(s)
GRANT SELECT ON VIEW YOUR_DATABASE.YOUR_SCHEMA.YOUR_VIEW TO ROLE MCP_ROLE;
```

Replace `YOUR_DATABASE`, `YOUR_SCHEMA`, `YOUR_TABLE`, and `YOUR_VIEW` with your actual object names. Repeat grants as needed for multiple objects.

Gather Connection Details: Make sure you have noted:

- Snowflake Account Identifier (e.g., `xy12345.us-west-2`)
- MCP Username (`MCP_USER`)
- MCP User Password (if using password auth)
- Or the full path to the private key file (if using key-pair auth)
- Private key passphrase (only if the key is encrypted)
- MCP Role (`MCP_ROLE`)
- MCP Warehouse (`MCP_WAREHOUSE`)
- Target database(s) and schema(s)
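Before wiring these details into the server, you can sanity-check them from a short script. This is a sketch, not the server's own code: the environment-variable names mirror the hypothetical ones used later in this guide, and the commented verification step assumes the `snowflake-connector-python` package (recent versions of its `connect()` accept a `private_key_file` argument).

```python
import os

def build_connect_kwargs(env):
    """Assemble Snowflake connection arguments from a dict of
    environment variables, choosing password or key-pair auth."""
    kwargs = {
        "account": env["SNOWFLAKE_ACCOUNT"],
        "user": env["SNOWFLAKE_USER"],
        "role": env.get("SNOWFLAKE_ROLE"),
        "warehouse": env.get("SNOWFLAKE_WAREHOUSE"),
    }
    if "SNOWFLAKE_PASSWORD" in env:
        kwargs["password"] = env["SNOWFLAKE_PASSWORD"]
    else:
        kwargs["private_key_file"] = env["SNOWFLAKE_PRIVATE_KEY_PATH"]
    # Drop unset optional values
    return {k: v for k, v in kwargs.items() if v is not None}

# To verify end-to-end (assumes `pip install snowflake-connector-python`):
#   import snowflake.connector
#   conn = snowflake.connector.connect(**build_connect_kwargs(os.environ))
#   print(conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone())
#   conn.close()
```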
Step 2: Installing Local Dependencies
Install the required runtime (Go/Python/Node.js) and Git based on the project's requirements.

- Install Runtime: Follow the instructions for Go, Python, or Node.js as determined from the project's README. Verify the installation (e.g., `go version`, `python --version`, `node --version`).
- Install Git: Install from git-scm.com/downloads if necessary. Verify with `git --version`.
Step 3: Cloning the Repository
Download the server code from GitHub.

- Open your terminal/command prompt.
- Navigate to your preferred projects directory (e.g., `cd ~/projects`).
- Clone the repository:

```bash
git clone https://github.com/isaacwasserman/mcp-snowflake-server.git
```

- Change into the project directory:

```bash
cd mcp-snowflake-server
```
Step 4: Configuration
The Snowflake MCP Server needs your Snowflake connection details. Configuration is often managed via environment variables or a configuration file (`config.yaml`, `.env`). Check the project's README for the definitive method. We'll outline common approaches.
Method A: Environment Variables (Common)
Many server applications use environment variables for configuration, which is convenient for containerized deployments and avoids hardcoding secrets.
Identify Required Variables: Look for required environment variables in the project's README. Common variables might include:

- `SNOWFLAKE_ACCOUNT`: Your Snowflake account identifier.
- `SNOWFLAKE_USER`: The MCP username (`MCP_USER`).
- `SNOWFLAKE_ROLE`: The MCP role (`MCP_ROLE`).
- `SNOWFLAKE_WAREHOUSE`: The MCP warehouse (`MCP_WAREHOUSE`).
- `MCP_PORT` or `PORT`: The network port for the server (e.g., `8080`, `9090`).
- For Password Auth: `SNOWFLAKE_PASSWORD`: The MCP user's password.
- For Key-Pair Auth:
  - `SNOWFLAKE_PRIVATE_KEY_PATH`: Full path to the private key file (e.g., `/home/user/.ssh/snowflake_mcp_key.p8`).
  - `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE`: The passphrase for the private key (leave unset or empty if the key is not encrypted).

Set Environment Variables: You can set these directly in your shell session (they disappear when the session closes) or use a `.env` file if the application supports it (via libraries like `godotenv` or `python-dotenv`).

- Directly in Shell (Linux/macOS):

```bash
export SNOWFLAKE_ACCOUNT="your-account-id"
export SNOWFLAKE_USER="MCP_USER"
# ... set other variables ...
export SNOWFLAKE_PRIVATE_KEY_PATH="/path/to/your/private/key.p8"
export MCP_PORT="9090"
# Run the server command in the same shell
```

- Directly in Shell (Windows CMD):

```bat
set SNOWFLAKE_ACCOUNT=your-account-id
set SNOWFLAKE_USER=MCP_USER
rem ... set other variables ...
set SNOWFLAKE_PRIVATE_KEY_PATH=C:\path\to\your\private\key.p8
set MCP_PORT=9090
rem Run the server command
```

- Using a `.env` File: Create a file named `.env` in the project root:

```
SNOWFLAKE_ACCOUNT=your-account-id
SNOWFLAKE_USER=MCP_USER
SNOWFLAKE_ROLE=MCP_ROLE
SNOWFLAKE_WAREHOUSE=MCP_WAREHOUSE
# Choose one auth method:
# SNOWFLAKE_PASSWORD=your-strong-password
SNOWFLAKE_PRIVATE_KEY_PATH=/path/to/your/private/key.p8
# SNOWFLAKE_PRIVATE_KEY_PASSPHRASE=your-key-passphrase  # Only if key is encrypted
MCP_PORT=9090
```

Make sure this file is listed in your `.gitignore` to avoid committing secrets. The application needs to be coded to load this file automatically.
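If the runtime lacks a dotenv library, the `.env` format above is simple enough to load with a few lines of stdlib Python. This is a minimal sketch; real libraries such as `python-dotenv` additionally handle quoting, `export` prefixes, and variable interpolation:

```python
import os

def load_dotenv(path=".env", environ=os.environ):
    """Minimal .env loader: KEY=VALUE lines, '#' comments and blank
    lines ignored. Existing environment variables are not overwritten."""
    try:
        with open(path) as f:
            lines = f.readlines()
    except FileNotFoundError:
        return
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        environ.setdefault(key.strip(), value.strip())
```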
Method B: Configuration File (e.g., `config.yaml`)
If the project uses a configuration file:
- Locate and Copy: Find a sample file (e.g., `config.yaml.example`) and copy it to the required name (e.g., `config.yaml`).
). - Edit the File: Open the file and fill in the Snowflake connection details and server settings (port, etc.) similar to the environment variable names above. Pay close attention to the structure (indentation in YAML).
- Secure the File: Ensure appropriate file permissions restrict access, especially if storing passwords or key paths directly.
Security: Never commit passwords or private keys directly into your version control system (Git). Use environment variables, `.env` files added to `.gitignore`, or secure secret management solutions.
Step 5: Building/Preparing the Server
Depending on the language used:
If Go:

- Ensure you are in the project directory (`cd mcp-snowflake-server`).
- Download dependencies and build the executable:

```bash
go mod download  # Optional; go build usually handles this
go build .
```

- This creates an executable file (e.g., `mcp-snowflake-server` or `mcp-snowflake-server.exe`).
If Python:

- Ensure you are in the project directory.
- Set up a virtual environment (recommended):

```bash
python -m venv venv
source venv/bin/activate    # Linux/macOS
# or: .\venv\Scripts\activate    # Windows
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- There might not be a separate "build" step; you'll run the main Python script directly.
If Node.js:

- Ensure you are in the project directory.
- Install dependencies:

```bash
npm install
```

- There might be a build step (`npm run build`) if the project uses TypeScript or a bundler, or you might run the main JavaScript file directly. Check the `package.json` scripts.
Step 6: Running the Server
Execute the server application.
- Ensure Configuration is Set: Double-check that environment variables are exported in your current session, or that the `.env`/`config.yaml` file is correctly populated and in the right location.
- Run the appropriate command:

Go executable:

```bash
./mcp-snowflake-server       # Linux/macOS
.\mcp-snowflake-server.exe   # Windows
```

Python script (assuming the main file is `main.py` or `app.py`):

```bash
python main.py
```

Node.js script (check `package.json`'s `start` script or the main file):

```bash
node server.js   # Or: npm start
```
- Monitor Output: Watch the terminal for log messages. Expect to see:
  - Confirmation that configuration is loaded.
  - A message indicating the server is listening on the configured port (e.g., `Server listening on port 9090`).
  - Possibly initial connection tests or status messages related to Snowflake (though the connection often happens lazily on the first request).
- Keep Alive: The server runs in the foreground. Use `Ctrl + C` to stop it. For production or long-term use, employ a process manager (`systemd`, or `pm2` for Node.js) or containerization (Docker) for reliability and background execution.
Step 7: Testing the Server
Send an MCP request to the running server to verify it can query Snowflake.
Refer to Documentation: Check the `isaacwasserman/mcp-snowflake-server` README or the MCP specification for the exact JSON request format this server expects. It usually involves a POST request to an endpoint like `/context`.

Use `curl` (or a similar tool): Open a new terminal.

Example `curl` Request (Hypothetical; adapt to the actual server API): Assume you want to run:

```sql
SELECT product_name, sales_count
FROM YOUR_DATABASE.YOUR_SCHEMA.SALES_SUMMARY
WHERE region = 'NA'
ORDER BY sales_count DESC
LIMIT 5;
```

Feeding the JSON body through a quoted heredoc avoids shell-quoting problems with the single quotes inside the SQL:

```bash
curl -X POST http://localhost:9090/context \
  -H "Content-Type: application/json" \
  --data @- <<'EOF'
{
  "requester": { "client_id": "test-suite" },
  "requested_context": {
    "type": "snowflake/sql",
    "query": "SELECT product_name, sales_count FROM YOUR_DATABASE.YOUR_SCHEMA.SALES_SUMMARY WHERE region = 'NA' ORDER BY sales_count DESC LIMIT 5;"
  }
}
EOF
```
Important Adjustments:

- Replace `http://localhost:9090/context` with the actual host, port, and endpoint.
- Replace the SQL query with a valid query against Snowflake tables/views accessible to `MCP_USER`/`MCP_ROLE`. Ensure proper SQL syntax and escaping; single quotes inside the SQL string are a common source of errors when the JSON body is itself wrapped in shell quotes.
- Verify the JSON structure (`requester`, `requested_context`, `type`, `query`) matches what this specific server expects.
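Shell quoting around SQL embedded in JSON is fiddly; building the request in code sidesteps it entirely. A Python sketch, where the endpoint path and payload shape are assumptions carried over from the hypothetical example above:

```python
import json
import urllib.request

def build_context_request(sql, client_id="test-suite"):
    """Build the (hypothetical) MCP context-request payload."""
    return {
        "requester": {"client_id": client_id},
        "requested_context": {"type": "snowflake/sql", "query": sql},
    }

def post_context(url, sql):
    """POST the payload and return the decoded JSON response."""
    body = json.dumps(build_context_request(sql)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (single quotes in the SQL need no extra escaping here):
# print(post_context("http://localhost:9090/context",
#                    "SELECT product_name FROM MY_DB.MY_SCHEMA.SALES_SUMMARY "
#                    "WHERE region = 'NA' LIMIT 5"))
```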
Check the Response: A successful request should return a JSON response containing the query results, formatted according to the MCP spec (e.g., a list of objects or tabular data within `context_data`).

Check Server Logs: Examine the terminal where the server is running. You should see logs indicating:

- An incoming request was received.
- A connection attempt to Snowflake (if not already connected).
- Execution of the SQL query.
- The results being returned, or any errors encountered (authentication failure, SQL error, permission denied, etc.).
Troubleshooting Common Issues
- Snowflake Authentication Errors:
  - Double-check `SNOWFLAKE_ACCOUNT`, `SNOWFLAKE_USER`, `SNOWFLAKE_ROLE`, and `SNOWFLAKE_WAREHOUSE`.
  - Password: Verify the password is correct.
  - Key-Pair: Ensure `SNOWFLAKE_PRIVATE_KEY_PATH` is the correct full path to the private key file. Check file permissions (the server process must be able to read it). Verify the public key was correctly assigned to the user in Snowflake. If using a passphrase, ensure `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE` is set correctly. Check whether the server requires an unencrypted private key.
- Connection Errors:
- Verify the Snowflake Account Identifier format.
- Check network connectivity from the server host to your Snowflake instance (firewalls, proxies).
- Configuration Not Loaded:
  - Env Vars: Ensure variables are correctly exported in the same shell session where the server is run, or that the `.env` mechanism is working.
  - Config File: Confirm the file name and location are correct and the format is valid (e.g., proper YAML indentation).
- Build/Dependency Issues:
  - Ensure the correct runtime version (Go/Python/Node.js) is installed.
  - Run the dependency installation command (`go mod download`, `pip install`, `npm install`) again. Check for network errors.
- Snowflake Query Errors:
  - `Object does not exist or not authorized`: Verify that `MCP_ROLE` has `USAGE` on the warehouse, database, and schema, and `SELECT` on the table/view. Check for typos in the query's object names. Ensure the user's default database/schema or fully qualified names are used correctly in the query.
  - `SQL compilation error`: Check the SQL syntax sent in the MCP request.
Security Considerations
- Snowflake Credentials: Prioritize key-pair authentication. Securely store the private key file with strict permissions. Avoid passwords in plain text config files or environment variables where possible; use secret management systems (like AWS Secrets Manager, GCP Secret Manager, HashiCorp Vault) in production. Grant only the necessary Snowflake permissions (least privilege).
- Network Exposure: Restrict network access to the MCP server port using firewalls. Only allow connections from trusted sources (e.g., the application servers hosting your LLMs). Consider binding the server only to specific network interfaces (`localhost` if possible).
- Input Sanitization/SQL Injection: Since the server executes SQL queries potentially derived from external input, it's critical to understand if the server implementation includes safeguards against SQL injection. If the server allows arbitrary queries, the calling application bears responsibility for ensuring only safe queries are sent. Ideally, the server might offer parameterized queries or strict allowlisting/validation if it constructs SQL itself. Review the project's security posture.
- Server-Side Authentication: Consider adding an authentication layer (e.g., API Keys, OAuth tokens) to the MCP server itself, so only authorized clients (your LLM applications) can make requests, even within a trusted network.
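To make the injection concern concrete, here is a deliberately naive read-only gate a calling application might apply before forwarding a query. It is a sketch for illustration, not a security boundary: least-privilege grants in Snowflake remain the real defense, and even a pure `SELECT` can exfiltrate data.

```python
import re

# Reject statements containing write/DDL keywords. Conservative by design:
# a legitimate SELECT mentioning "create" in a string literal is also rejected.
_FORBIDDEN = re.compile(
    r"\b(insert|update|delete|merge|create|alter|drop|grant|revoke|call|copy)\b",
    re.IGNORECASE,
)

def looks_read_only(sql):
    """Allow a single statement that starts with SELECT or WITH."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # reject multi-statement batches
        return False
    if not re.match(r"(?is)^\s*(select|with)\b", stripped):
        return False
    return not _FORBIDDEN.search(stripped)
```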
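A minimal shape for such a client-authentication layer, assuming a shared secret in an `MCP_API_KEY` environment variable checked against an `X-API-Key` request header (both names are hypothetical):

```python
import hmac
import os

def request_authorized(headers, expected_key=None):
    """Check an X-API-Key header against a server-side secret using a
    constant-time comparison to avoid timing leaks."""
    expected_key = expected_key or os.environ.get("MCP_API_KEY", "")
    supplied = headers.get("X-API-Key", "")
    if not expected_key:
        return False  # fail closed if no key is configured
    return hmac.compare_digest(supplied, expected_key)
```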
Conclusion
Congratulations! You have navigated the process of setting up the necessary Snowflake resources and installing the Snowflake MCP Server (`isaacwasserman/mcp-snowflake-server`). This server provides a powerful mechanism to bridge your AI models with the rich, specific data residing in your Snowflake data warehouse using the standardized Model Context Protocol.
You can now empower your models with real-time, contextual information, leading to more accurate, relevant, and data-grounded responses. Remember to consult the specific project repository's README for the most accurate details on configuration, expected MCP request formats, and any advanced features. Always prioritize security, especially credential management and query safety, when deploying such a bridge to your valuable enterprise data.