Exceeded maximum of 100 connections per instance

I am currently facing a recurring issue with my Cloud Run application which is configured to connect to a MySQL instance. Occasionally, I encounter the following error: "Exceeded maximum of 100 connections per instance." This issue arises despite using Django, which I understand should automatically manage and close database connections once a request is completed.

Here are some details about my setup:

  • Cloud Service: Google Cloud Run
  • Database: MySQL on Google Cloud SQL
  • Framework: Django

I am seeking guidance on why this error might be occurring and how to resolve it. I believe all necessary configurations for connection pooling and proper session management are in place, yet the problem persists. Could there be an underlying issue with how Django interfaces with Cloud SQL, or is it possible that my application configuration needs further optimization?

Any insights or recommendations you could provide would be greatly appreciated. Please let me know if there are logs or further details needed to assist you in diagnosing this issue more effectively.

Thank you for your time and assistance.

Best regards,

1 ACCEPTED SOLUTION

Hi @message-checker ,

The connection limit error you're encountering with your Cloud Run application and MySQL instance is a common challenge, but there are definitely ways to address it. Here are several strategies you can employ to tackle this issue effectively:

1. Django Connection Management:

  • CONN_MAX_AGE: This setting controls how long Django keeps database connections open after a request. If it's too high relative to how long your Cloud Run instances live, idle connections can pile up across instances. Try setting it to 0 to disable persistent connections, or adjust it to a value that aligns with your instance durations (see the settings sketch after this list).

  • Explicit Closure: Even with pooling, sometimes connections aren't closed as expected. Adding a request_finished signal handler (or middleware) that explicitly closes the connection after each request is a solid practice:

     
    from django.core.signals import request_finished
    from django.db import connection
    from django.dispatch import receiver
    
    # Runs after every request and closes this worker's database connection.
    @receiver(request_finished)
    def close_db_connection(sender, **kwargs):
        connection.close()
    

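To make the CONN_MAX_AGE point concrete, here is a minimal settings.py sketch; the database name, user, and connection string are placeholders, and the /cloudsql/... socket path assumes the Cloud SQL instance is attached to your Cloud Run service:

    import os

    # Placeholder names and credentials -- substitute your own instance details.
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "mydb",
            "USER": "myuser",
            "PASSWORD": os.environ["DB_PASSWORD"],
            # Unix socket exposed inside Cloud Run when the Cloud SQL
            # instance is attached to the service.
            "HOST": "/cloudsql/PROJECT_ID:REGION:INSTANCE_NAME",
            # 0 disables persistent connections: open per request, close at the end.
            "CONN_MAX_AGE": 0,
        }
    }

With CONN_MAX_AGE at 0, each request opens and closes its own connection, so an instance's connection count stays roughly bounded by its request concurrency.
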
2. Cloud SQL Configuration:

  • Machine Type: If your MySQL instance is under heavy load, upgrading to a tier with more memory may be necessary; Cloud SQL's default connection limit scales with instance memory. Check your instance's current specs and compare them to your workload.
  • Connection Pooling: PgBouncer is PostgreSQL-only, so for MySQL a pooler such as ProxySQL is the usual choice. If you run one, make sure it's configured for Cloud Run's environment and limitations; the pooler's documentation should provide guidance, and on the Django side only the connection endpoint changes (see the sketch after this list).

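If you do put a pooler in front of Cloud SQL, the only change on the Django side is the connection endpoint. A rough sketch, assuming a ProxySQL instance reachable from your service on its default client port 6033 (the host and port here are assumptions, not details from your setup):

    # Point Django at the pooler rather than at Cloud SQL directly.
    # 127.0.0.1:6033 assumes a ProxySQL listener alongside the app; 6033 is
    # ProxySQL's default MySQL client port. Adjust to wherever yours listens.
    DATABASES["default"]["HOST"] = "127.0.0.1"
    DATABASES["default"]["PORT"] = "6033"
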
3. Cloud Run Scaling:

  • Max Instances: Setting a reasonable limit on Cloud Run instances can help control the total number of connections. This prevents runaway scaling from overwhelming your database.
  • Autoscaling: Cloud Run autoscales on request concurrency and CPU rather than on database metrics, so tune the per-instance concurrency and max instances so that the number of instances multiplied by connections per instance stays comfortably under your database's limit. This helps avoid connection spikes.

4. Monitoring and Performance Optimization:

  • Cloud Monitoring: Actively monitor database connection metrics using Google Cloud Monitoring. This will give you insights into usage patterns and help you identify trends or anomalies (see the sketch after this list).
  • Logging: Enhance your logging to track the lifecycle of database connections. This can reveal issues like unclosed connections or long-running queries.

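As a starting point for the monitoring item above, a rough sketch using the google-cloud-monitoring client library to read recent connection counts; the project ID is a placeholder, and it assumes the MySQL connections metric cloudsql.googleapis.com/database/network/connections:

    import time
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = "projects/YOUR_PROJECT_ID"  # placeholder

    # Fetch the last hour of connection counts reported by Cloud SQL.
    now = int(time.time())
    interval = monitoring_v3.TimeInterval(
        {"start_time": {"seconds": now - 3600}, "end_time": {"seconds": now}}
    )
    series = client.list_time_series(
        request={
            "name": project,
            "filter": 'metric.type = "cloudsql.googleapis.com/database/network/connections"',
            "interval": interval,
            "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
        }
    )
    for ts in series:
        # Points come back newest first; print the latest sample per database.
        print(ts.resource.labels["database_id"], ts.points[0].value.int64_value)
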
5. Additional Considerations:

  • Serverless VPC Access: If not already in use, setting up Serverless VPC Access can improve connectivity between Cloud Run and Cloud SQL, potentially reducing latency and connection overhead.
  • Query Optimization: Review your Django ORM usage and SQL queries for potential optimizations (indexing, query structure). Efficient queries put less strain on your database (see the small example after this list).
  • Background Tasks: Offload long-running or resource-intensive operations to asynchronous services like Cloud Tasks. This frees up connections on your main instances.

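To illustrate the query-optimization point, a small hypothetical example (the Order and Customer models are invented here, with Order holding a ForeignKey to Customer) showing how select_related avoids an N+1 query pattern:

    # Assumes: from myapp.models import Order  (hypothetical app and models)

    # N+1 pattern: one query for the orders, plus one extra query per order.
    for order in Order.objects.all():
        print(order.customer.name)

    # Single query: the related customer row is fetched with a JOIN up front.
    for order in Order.objects.select_related("customer").all():
        print(order.customer.name)
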
 


Thank you for your assistance with the recent issue we encountered regarding the maximum connection limits on our instance.

I wanted to update you that I have successfully resolved the problem by ensuring explicit closure of each connection once a request is completed. This approach has effectively prevented the "Exceeded maximum of 100 connections per instance" error that we were experiencing, despite our settings allowing for up to 10,000 active connections.

I appreciate your support and guidance. Please let me know if there are any further optimizations or recommendations you would suggest.