
gRPC PERMISSION_DENIED Error & Streaming Hangs in Databricks on Google Cloud

I'm running multiple streaming jobs on Databricks in the Google Cloud environment and am encountering issues that appear to be related to gRPC session management.

I implemented a custom streaming query listener to capture job status events and upsert them into a Delta table. The listener works fine when processing a single streaming table (upserting to a Delta table in a GCS bucket), but when I run approximately 10 or more streaming tables concurrently, I receive this error:

/databricks/spark/python/pyspark/sql/connect/streaming/query.py:561: UserWarning: Listener <...CustomStreamingQueryListener object...> threw an exception
<_InactiveRpcError of RPC that terminated with:
    status = StatusCode.PERMISSION_DENIED
    details = "Local RPC without associated session."
    debug_error_string = "UNKNOWN:Error received from peer ..." >
  warnings.warn(f"Listener {str(listener)} threw an exception\n{e}")

This suggests an issue with gRPC, specifically with session management or permissions, when handling multiple concurrent streaming queries on Google Cloud.
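For context, my listener follows the standard PySpark StreamingQueryListener pattern. A minimal sketch of it (with a hypothetical table path and simplified upsert logic, not my actual code) looks roughly like this:

from delta.tables import DeltaTable
from pyspark.sql.streaming import StreamingQueryListener

# Hypothetical path; the real table lives in a GCS bucket as described above.
STATUS_TABLE_PATH = "gs://<my-bucket>/job_status"

class CustomStreamingQueryListener(StreamingQueryListener):
    def onQueryStarted(self, event):
        pass

    def onQueryProgress(self, event):
        pass

    def onQueryIdle(self, event):
        # Upsert the latest status of this query into the Delta status table.
        # Assumes the table already exists and `spark` is the ambient session
        # that Databricks provides in notebooks.
        updates = spark.createDataFrame(
            [(str(event.id), str(event.runId), "IDLE", event.timestamp)],
            ["query_id", "run_id", "status", "event_ts"],
        )
        (
            DeltaTable.forPath(spark, STATUS_TABLE_PATH).alias("t")
            .merge(updates.alias("u"), "t.query_id = u.query_id")
            .whenMatchedUpdateAll()
            .whenNotMatchedInsertAll()
            .execute()
        )

    def onQueryTerminated(self, event):
        pass

spark.streams.addListener(CustomStreamingQueryListener())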

Questions:

  1. Resolving the gRPC Error:
    How can I address the "Local RPC without associated session" (PERMISSION_DENIED) error when performing Delta upserts in the onQueryIdle event? Could this be due to gRPC session management issues between Databricks and Google Cloud, and are there configuration tweaks or best practices to mitigate this error in a high-concurrency environment?

Note: If you'd like to see the full code context, please refer to my detailed post on the Databricks Community forum here.

1 ACCEPTED SOLUTION

Hi @minhhung10l!
This gRPC PERMISSION_DENIED error with "no associated session" usually happens when the Spark driver loses its active authentication context during long-running or highly concurrent streaming workloads.
For Databricks on Google Cloud, a best practice is to refresh service account credentials more frequently, or to use Service Account Key authentication instead of relying on Application Default Credentials. Scaling up the driver (more memory/CPU) can also help avoid session drops under heavy load.
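For example, one commonly documented way to attach a service account key on Databricks on Google Cloud is via the cluster's Spark config. The exact keys can vary with the GCS connector version, so treat this as a sketch with placeholder values:

spark.hadoop.google.cloud.auth.service.account.enable true
spark.hadoop.fs.gs.project.id <project-id>
spark.hadoop.fs.gs.auth.service.account.email <sa-name>@<project-id>.iam.gserviceaccount.com
spark.hadoop.fs.gs.auth.service.account.private.key.id {{secrets/<scope>/<key-id>}}
spark.hadoop.fs.gs.auth.service.account.private.key {{secrets/<scope>/<private-key>}}

Storing the key material in a Databricks secret scope (referenced with the {{secrets/...}} syntax shown above) keeps the private key out of the config in plaintext.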
Hope this points you in the right direction!


Hi @a_aleinikov,

Thank you very much for your detailed explanation, and my apologies for the delayed reply.

I’d like to make sure I fully understand the approaches you mentioned for preventing the “no associated session” error:

  1. Refresh credentials more frequently

    • Could you share any recommended patterns or tools for automating token refresh in a Databricks-on-GCP environment? For example, is it best to schedule a regular gcloud auth application-default print-access-token call, or is there a built-in Spark/Databricks mechanism you’d recommend? (I’ve sketched the kind of approach I have in mind below, after these questions.)

  2. Use a Service Account Key instead of ADC

    • Am I correct in understanding that supplying a JSON key file when initializing my Databricks Connect session will completely avoid the OAuth token-expiration issue?

    • Are there any security or operational caveats I should be aware of when adopting a Service Account Key in production?

Would those two options cover all the best practices for maintaining an active authentication context in long-running or highly concurrent streaming workloads? Or do you recommend any additional safeguards?
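To make question 1 concrete, this is roughly the kind of programmatic refresh I was imagining, using the google-auth Python library (just a sketch on my side, not something I have validated against Databricks Connect):

import google.auth
from google.auth.transport.requests import Request

# Obtain Application Default Credentials and refresh the access token if needed.
credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
if not credentials.valid:
    credentials.refresh(Request())  # fetches a fresh OAuth access token

access_token = credentials.token  # would be re-used wherever re-authentication is needed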

Thank you again for your time and expertise.
