Hello All,
I'm building a prototype with an API Gateway that exposes two endpoints. I am using Redis to cache data between the call to the first endpoint and the call to the second. From the first endpoint, I am able to cache the data successfully. When I attempt to retrieve the data from the second endpoint, I receive 'Error 110 connecting to <my_ip>:6379'.
I have verified that everything is in the same region and uses the default network/VPC/IP ranges. I'm new to Redis, so I can't rule out a Redis configuration issue, though I am not setting anything beyond what Memorystore provides by default.
My connection settings:
Authorized network: default (<my_project>)
Connection mode: Private service access
IP range: <my_ip>/29
Connection limit: 15000
For troubleshooting, I have turned off AUTH, in-transit encryption, and CMEK. The instance is Basic Tier with 16 GB.
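For when I turn AUTH and in-transit encryption back on, I believe the client call needs the extra arguments as well. This is only a sketch with placeholder names (redis_auth_string, ca_cert_path), not my actual values:

    # Sketch for re-enabling AUTH and in-transit encryption.
    # redis_auth_string and ca_cert_path are placeholders for the instance's
    # AUTH string and the server CA certificate downloaded from Memorystore.
    redis_client = redis.Redis(
        host=redis_host,
        port=redis_port,
        password=redis_auth_string,
        ssl=True,
        ssl_ca_certs=ca_cert_path,
    )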
The VPC connector is set up in the right region, using the default network and the 10.8.0.0/28 IP range.
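A quick ping with explicit timeouts (the timeout values here are just examples) can check reachability through the connector separately from the caching logic, so a broken path fails fast instead of hanging:

    import redis

    # Minimal reachability probe; timeout values are arbitrary examples.
    probe = redis.Redis(
        host=redis_host,
        port=redis_port,
        socket_connect_timeout=3,
        socket_timeout=3,
    )
    try:
        probe.ping()
        logger.log_text("Redis reachable")
    except redis.exceptions.ConnectionError as e:
        logger.log_text(f"Redis unreachable: {e}", severity="ERROR")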
My code is Python 3.9 in Cloud Functions v2.
First function:
def get_redis_connection():
    redis_client = redis.Redis(host=redis_host, port=redis_port)
    logger.log_text(f"host: {redis_host} port {redis_port} client {redis_client}")
    return redis_client

redis_client = get_redis_connection()
redis_client.hset(str(state), mapping=login_params)
I have also tried closing this connection after the hset and creating the connection at function startup.
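The startup-connection variant looks roughly like this (a sketch only; the timeouts are examples I added so a bad connection surfaces quickly, and cache_login_params is just an illustrative wrapper name):

    import redis

    # Connection created once at cold start and reused across invocations.
    # socket_connect_timeout/socket_timeout are example values, not required settings.
    redis_client = redis.Redis(
        host=redis_host,
        port=redis_port,
        socket_connect_timeout=5,
        socket_timeout=5,
    )

    def cache_login_params(state, login_params):
        # Illustrative wrapper; stores the login parameters under the state key.
        redis_client.hset(str(state), mapping=login_params)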
In the second function:
def get_redis_connection():
    redis_client = redis.Redis(host=redis_host, port=redis_port)
    logger.log_text(f"host: {redis_host} port {redis_port} client {redis_client}")
    return redis_client

def delete_cache(state, redis_client):
    redis_client.hdel(state, "iss")
    redis_client.hdel(state, "login_hint")
    redis_client.hdel(state, "target_link_uri")
    redis_client.hdel(state, "lti_message_hint")
    redis_client.hdel(state, "lti_deployment_id")
    redis_client.hdel(state, "client_id")
    redis_client.hdel(state, "lti_storage_target")

def get_cache_data(state):
    try:
        redis_client = get_redis_connection()
        cache = {}
        cache['iss'] = redis_client.hget(state, 'iss')
        cache['login_hint'] = redis_client.hget(state, 'login_hint')
        cache['target_link_uri'] = redis_client.hget(state, 'target_link_uri')
        cache['lti_message_hint'] = redis_client.hget(state, 'lti_message_hint')
        cache['lti_deployment_id'] = redis_client.hget(state, 'lti_deployment_id')
        cache['client_id'] = redis_client.hget(state, 'client_id')
        cache['lti_storage_target'] = redis_client.hget(state, 'lti_storage_target')
        # hget returns None for missing fields, so check the values rather than the dict itself
        if any(value is not None for value in cache.values()):
            return cache
        else:
            logger.log_text(f"get_cache_data: invalid state parameter - {state}", severity="ERROR")
            return None
    except Exception as e:
        logger.log_text(f"Error getting cache - {e}", severity="ERROR")
        return None
    finally:
        delete_cache(state, redis_client)
        redis_client.close()
I've also tried hgetall, but as soon as I attempt to retrieve data, I get: 'Error 110 connecting to <my_ip>:6379. Connection timed out.'
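For reference, a more compact version of the same lookup (a sketch assuming the same redis_host, redis_port, and logger) would read the whole hash in one round trip and drop the key with a single DELETE:

    def get_cache_data(state):
        # decode_responses=True returns str values instead of bytes.
        redis_client = redis.Redis(host=redis_host, port=redis_port, decode_responses=True)
        try:
            cache = redis_client.hgetall(state)  # empty dict if the key does not exist
            if cache:
                return cache
            logger.log_text(f"get_cache_data: invalid state parameter - {state}", severity="ERROR")
            return None
        except Exception as e:
            logger.log_text(f"Error getting cache - {e}", severity="ERROR")
            return None
        finally:
            redis_client.delete(state)  # remove the whole hash instead of field-by-field HDELs
            redis_client.close()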
Any help would be greatly appreciated.
Figured this out. I created the first function and got it working, then created the second one and mirrored all of its settings. When you expand Runtime, Build, and Connection Settings and go to the Connections tab, there is a dropdown to add your VPC connector. Whatever you select there, the dropdown displays None once it is closed, so when I mirrored the settings from the working function I assumed None was correct and never attached the connector to the second function. Hopefully this saves someone else from getting stuck here.