I am having an access issue with MySQL on my Google Cloud Platform (GCP) instance. I have a public IP set up to access the MySQL database. I am able to connect with SQL Workbench in my development environment from my home office. Its IP has been authorized, and the user ID is granted full access to the GCP SQL database.
I am able to create tables with that user ID and IP using SQL Workbench.
But when I try the Python interface using SQLAlchemy (SQLA), the behavior is not consistent. The SQLA create_engine and sessionmaker commands work fine with the same user, but the SQLA metadata create_all that issues the SQL CREATE TABLE statements gives me an access error.
Here's the Python SQLA error message - (pymysql.err.OperationalError) (1045, "Access denied for user 'user'@'68.nnn.nnn.nnn' (using password: YES)")
Any ideas?
The error message (pymysql.err.OperationalError) (1045, "Access denied for user 'user'@'68.nnn.nnn.nnn' (using password: YES)") indicates that the MySQL server is rejecting the connection attempt from your Python/SQLAlchemy application due to incorrect credentials or insufficient privileges. Here’s a breakdown of the possible causes and how to troubleshoot them:
Potential Causes:
Firewall Rules: Ensure the Cloud SQL instance's authorized networks include the public IP your Python client connects from.
User Permissions: Verify that the MySQL user has the required privileges for the host it connects from:
SHOW GRANTS FOR 'user'@'68.nnn.nnn.nnn'; -- View existing grants
GRANT ALL PRIVILEGES ON database_name.* TO 'user'@'68.nnn.nnn.nnn'; -- Example to grant full access
Connection String: Double-check the SQLAlchemy connection string for typos in the user, password, host, and database name:
from sqlalchemy import create_engine
connection_string = 'mysql+pymysql://user:password@public_ip_address:3306/database_name'
engine = create_engine(connection_string)
Password Issues: Confirm the password is correct; a password that works interactively can still fail if it is mistyped in code or contains characters that need escaping.
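A common SQLAlchemy-specific cause of error 1045 is special characters in the password: characters such as @, :, or / break URL parsing even though the same password works in SQL Workbench. A minimal stdlib-only sketch (the password shown is a placeholder):

```python
from urllib.parse import quote_plus

# Percent-encode the password before embedding it in a SQLAlchemy URL.
password = "p@ss:word/123"  # hypothetical password with special characters
encoded = quote_plus(password)
url = f"mysql+pymysql://user:{encoded}@public_ip_address:3306/database_name"
print(encoded)  # p%40ss%3Aword%2F123
```

If your password contains any URL-reserved characters, encode it this way (or build the URL with sqlalchemy.engine.URL.create, which handles escaping for you).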
Troubleshooting Steps:
Check Logs: Review the MySQL error log in the Cloud SQL console for details on the rejected connection.
Simplify: Test with a minimal script that only opens a connection, to rule out the rest of your code.
Verify Network Access: Confirm the MySQL port is reachable from your machine:
telnet public_ip_address 3306
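If telnet is not available (it is often not installed by default on Windows), the same reachability check can be done with a short stdlib-only Python snippet:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Example (replace with your instance's public IP):
# print(port_is_open("public_ip_address", 3306))
```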
Thank you for your response. I have tried and checked all of the above. I still have the problem. I am able to CREATE TABLE in the MySQL database with the same user in SQL Workbench on the same machine. Does this imply that a SQLA configuration needs to be adjusted?
Given that you have verified all the potential causes and troubleshooting steps and are still experiencing the issue, it does suggest that the problem might lie within the SQLAlchemy configuration or the way it's being used in your script. Here are a few additional steps and configurations you can check:
SQLAlchemy Configuration and Usage
Connection Pooling:
engine = create_engine(
    'mysql+pymysql://user:password@public_ip_address:3306/database_name',
    pool_size=10,     # Adjust based on your requirements
    max_overflow=20   # Adjust based on your requirements
)
Session Configuration:
from sqlalchemy.orm import sessionmaker
Session = sessionmaker(bind=engine)
session = Session()
# Perform database operations
session.close() # Ensure session is closed properly
Debugging SQLAlchemy:
engine = create_engine(
    'mysql+pymysql://user:password@public_ip_address:3306/database_name',
    echo=True  # This will log all the SQL queries
)
Sample Code to Test Connectivity and Table Creation
Here's a minimal example to test connectivity and table creation using SQLAlchemy:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String
from sqlalchemy.orm import sessionmaker

# Connection string
DATABASE_URL = 'mysql+pymysql://user:password@public_ip_address:3306/database_name'

# Create engine
engine = create_engine(DATABASE_URL, echo=True)

# Create session
Session = sessionmaker(bind=engine)
session = Session()

# Define metadata and table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")

# Close the session
session.close()
Additional Considerations
Test with Direct PyMySQL Connection
To isolate the issue, try connecting directly with PyMySQL without SQLAlchemy to see if the problem persists:
import pymysql
# Direct connection using PyMySQL
connection = pymysql.connect(
    host='public_ip_address',
    user='user',
    password='password',
    database='database_name'
)

try:
    with connection.cursor() as cursor:
        # Create a new table
        sql = "CREATE TABLE test_direct (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255))"
        cursor.execute(sql)
    connection.commit()
finally:
    connection.close()
Thank you again for your response and suggestions. I have tried or checked them all. The code for the SQLA test and the direct PyMySQL test both had the same error: Access Denied.
I had used the same local machine (IP 68.xxx) and same user/password in SQL Workbench and it worked fine (e.g., Created a Table in MySQL on GCP). And with your example to remove SQLA from the test and use PyMySQL directly, doesn't that show that it is a Python connection problem to the GCP SQL database?
I have these installed in PyCharm:
* PyMySQL v 1.1.1
* cloud-sql-python-connector v 1.9.2
* mysql-connector-python v 8.4.0
Anything else I should check in the Python environment?
Thanks.
Given that the direct PyMySQL connection test also results in an "Access Denied" error while the same user and IP work with SQL Workbench, it suggests that the issue lies in how the Python environment or its libraries are connecting to the MySQL server.
Here are a few additional things to check and try:
Check Python Environment and Libraries
connection = pymysql.connect(
    host='public_ip_address',
    user='user',
    password='password',
    database='database_name',
    ssl={'ssl': {'ca': '/path/to/server-ca.pem'}}
)
from mysql.connector import connect

connection = connect(
    host='public_ip_address',
    user='user',
    password='password',
    database='database_name'
)
cursor = connection.cursor()
cursor.execute("CREATE TABLE test_connector (id INT AUTO_INCREMENT PRIMARY KEY, name VARCHAR(255))")
connection.commit()
cursor.close()
connection.close()
MySQL Authentication Plugin
If the MySQL user was created with the caching_sha2_password plugin, some client libraries can fail to authenticate; you can switch the account to mysql_native_password:
ALTER USER 'user'@'68.nnn.nnn.nnn' IDENTIFIED WITH mysql_native_password BY 'password';
Environment Variables and Path Issues: Make sure PyCharm is running the same Python interpreter and virtual environment where PyMySQL is installed.
Enable General Query Log on MySQL
SET GLOBAL general_log = 'ON';
SHOW VARIABLES LIKE 'general_log%';
Example General Debugging Code
Here's a final test code combining everything:
import pymysql
import mysql.connector

# PyMySQL test
try:
    conn = pymysql.connect(host='public_ip_address', user='user',
                           password='password', database='database_name')
    conn.close()
    print("PyMySQL connection succeeded.")
except pymysql.MySQLError as e:
    print(f"PyMySQL error: {e}")

# MySQL Connector test
try:
    conn = mysql.connector.connect(host='public_ip_address', user='user',
                                   password='password', database='database_name')
    conn.close()
    print("MySQL Connector connection succeeded.")
except mysql.connector.Error as e:
    print(f"MySQL Connector error: {e}")
Yes, I added the SSL args to the pymysql connection and then it worked!
But I don't see how to add the SSL args to the SQLA commands.
I thought the Google Cloud Proxy took care of the SSL args. Is that what allowed SQL Workbench to work okay? But why not SQLA?
I tried using the connect_args parameter in the SQLA create_engine statement to pass the SSL args. But I got this error:
(2003, "Can't connect to MySQL server on '34.nn.nn.nn' ([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: IP address mismatch, certificate is not valid for '34.nn.nn.nn'. (_ssl.c:1000))")
But the same SSL certificate worked for the pymysql test. I will regenerate the SSL certificate just to be sure.
The Cloud SQL Auth Proxy does handle SSL/TLS for you, but when connecting directly without the proxy, you need to manage SSL/TLS settings yourself.
To add SSL arguments to the SQLAlchemy connection, you need to pass additional parameters through the connection string. Here's how you can do it:
Install SSL Certificates: Ensure you have the necessary SSL certificates (e.g., server-ca.pem, client-cert.pem, client-key.pem) from your Cloud SQL instance. You can download these from the Cloud SQL instance details in the Google Cloud Console.
SQLAlchemy Connection String with SSL: You can include SSL arguments in the SQLAlchemy connection string using query parameters.
Here's an example:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Define the connection string with SSL arguments
DATABASE_URL = (
    "mysql+pymysql://user:password@public_ip_address:3306/database_name?"
    "ssl_ca=/path/to/server-ca.pem&"
    "ssl_cert=/path/to/client-cert.pem&"
    "ssl_key=/path/to/client-key.pem"
)

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
SQL Workbench likely works because it either supplies its own SSL configuration or skips strict hostname verification of the server certificate.
Google Cloud SQL Auth Proxy
If you prefer not to handle SSL certificates manually, you can use the Cloud SQL Auth Proxy. When you use the proxy, it handles the SSL/TLS encryption, and you connect to localhost or a Unix socket without needing to specify SSL arguments in your connection string.
Here's an example of how to use the Cloud SQL Auth Proxy:
Start the Cloud SQL Auth Proxy:
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:3306
Connect Using SQLAlchemy:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Connection string pointing to the proxy on localhost
DATABASE_URL = "mysql+pymysql://user:password@127.0.0.1:3306/database_name"

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
By using the Cloud SQL Auth Proxy, you can avoid managing SSL certificates manually and ensure secure connections.
I reset the SSL certificates for this GCP MySQL instance.
I recreated the SSL references and updated them for PyMySQL and SQL Workbench. They both still work (creating tables).
I added the same SSL references to the SQLA create engine, but still get this message.
(2003, "Can't connect to MySQL server on '34.nn.nn.nn' ([SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: IP address mismatch, certificate is not valid for '34.nn.nn.nn'. (_ssl.c:1000))")
I tried to also update the GCP SQL Proxy with the new SSL references but that did not seem to help.
It seems like the issue is related to SSL certificate verification, specifically an IP address mismatch. When using SSL certificates, the Common Name (CN) or Subject Alternative Name (SAN) in the certificate must match the hostname you are connecting to. In your case, the certificate does not match the IP address (34.nn.nn.nn).
To resolve this, you have a couple of options:
Option 1: Use the Cloud SQL Auth Proxy
Using the Cloud SQL Auth Proxy can simplify SSL management because it handles SSL/TLS encryption and certificate verification for you. Here's how you can set it up and use it with SQLAlchemy:
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:3306
Replace <INSTANCE_CONNECTION_NAME> with your Cloud SQL instance connection name, which looks like project:region:instance.
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Connection string pointing to the proxy on localhost
DATABASE_URL = "mysql+pymysql://user:password@127.0.0.1:3306/database_name"

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
Option 2: Use the Instance's Public DNS Name
If you prefer not to use the Cloud SQL Auth Proxy and want to connect directly using SSL, you should connect using the instance's public DNS name instead of its IP address. Here's how you can do it:
Find the Public DNS Name: Check the instance's overview page in the Google Cloud Console for a DNS name (not all instances have one).
Update SQLAlchemy Connection String: Use the public DNS name in your connection string and add SSL parameters:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Define the connection string with SSL arguments
DATABASE_URL = (
    "mysql+pymysql://user:password@instance-public-dns-name:3306/database_name?"
    "ssl_ca=/path/to/server-ca.pem&"
    "ssl_cert=/path/to/client-cert.pem&"
    "ssl_key=/path/to/client-key.pem"
)

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
Option 3: Bypass Certificate Validation (Not Recommended)
As a last resort, and generally not recommended for production due to security risks, you can disable SSL certificate verification by adding a parameter to your connection string. This should only be used for testing purposes:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Define the connection string with SSL arguments and disable certificate verification
DATABASE_URL = (
    "mysql+pymysql://user:password@public_ip_address:3306/database_name?"
    "ssl_ca=/path/to/server-ca.pem&"
    "ssl_cert=/path/to/client-cert.pem&"
    "ssl_key=/path/to/client-key.pem&"
    "ssl_verify_cert=false"
)

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
The use of the Proxy does make sense. I tested it but received this error.
(2003, "Can't connect to MySQL server on '127.0.0.1' ([WinError 10061] No connection could be made because the target machine actively refused it)")
The error (2003, "Can't connect to MySQL server on '127.0.0.1' ([WinError 10061] No connection could be made because the target machine actively refused it)") means nothing is listening on 127.0.0.1:3306, which suggests the Cloud SQL Auth Proxy is not running or not configured correctly on your local machine. Here are the steps to ensure the proxy is set up and running correctly.
Steps to Set Up and Run the Cloud SQL Auth Proxy
Set Up Authentication:
Ensure you have the correct authentication set up. You can use a service account key or authenticate with gcloud.
If using a service account key, download the key JSON file and set the environment variable:
set GOOGLE_APPLICATION_CREDENTIALS="path/to/your-service-account-key.json"
Start the Cloud SQL Auth Proxy:
Open a terminal or command prompt and navigate to the directory where the proxy executable is located.
Start the proxy with the instance connection name of your Cloud SQL instance:
./cloud_sql_proxy -instances=<INSTANCE_CONNECTION_NAME>=tcp:3306
Ensure that the instance connection name is in the format project:region:instance. Replace <INSTANCE_CONNECTION_NAME> with your actual instance connection name.
Check the Proxy Status:
Ensure that the proxy is running and listening on the specified port (3306).
SQLAlchemy Connection String
Once the proxy is running, update your SQLAlchemy connection string to point to 127.0.0.1:
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, String

# Connection string pointing to the proxy on localhost
DATABASE_URL = "mysql+pymysql://user:password@127.0.0.1:3306/database_name"

# Create the SQLAlchemy engine
engine = create_engine(DATABASE_URL)

# Test the connection by creating a table
metadata = MetaData()
test_table = Table(
    'test_table', metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String(50)),
)

# Create the table
try:
    metadata.create_all(engine)
    print("Table created successfully.")
except Exception as e:
    print(f"Error occurred: {e}")
Troubleshooting Tips
Check Firewall Rules:
Ensure that your local firewall is not blocking the connection to 127.0.0.1:3306.
Check Proxy Logs:
Look at the logs generated by the Cloud SQL Auth Proxy for any errors or messages that might indicate why the connection is being refused.
Verify Port Usage:
Ensure that port 3306 is not being used by another application on your local machine. You can check this using the following command:
netstat -ano | findstr 3306
Example Cloud SQL Auth Proxy Command
Here’s an example command to run the Cloud SQL Auth Proxy:
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:3306
Thank you for all your help. The final part of the fix was to add the Cloud SQL role(s) in the Cloud IAM section. Then I reset the proxy on the client and now it works!
Thank you again for all your help!