
BigQuery Data Transfer from Salesforce: Internal Errors

I have recently encountered the following errors:

Transfer run failed due to internal errors. Please try again later.

Or I also got:
UNABLE_TO_RETRY: A retriable error could not be retried because a stream payload was received from the server (see go/xs-unable-to-retry-message-received) (old status: INTERNAL: Exception was thrown by the Connector implementation: Failed to read next record : Failed to query results com.google.cloud.hosted.integrationservice.common.ConnectorStatusException

We tested the transfer for multiple tables; many of them succeeded, but some always fail with the same error. However, the configuration is the same for all of these tables, and there is no difference between custom and standard Salesforce objects.

Do you have any hints regarding these errors?

Thanks in advance!

The Salesforce API has also been configured properly, as it works through another platform.

Unfortunately, there's no specific error cause in the BigQuery Data Transfer log.


AndrewB
Community Manager

Hi @DariotraSec - since the BigQuery Data Transfer Service for Salesforce is still in Preview, you can contact dts-preview-support@google.com for help.

Hi @DariotraSec, it looks like you're facing two key challenges with your Salesforce to BigQuery data transfer:

  1. "Transfer run failed due to internal errors"
  2. "UNABLE_TO_RETRY" and related ConnectorStatusException errors

These problems can result from a variety of causes, such as API limits, mismatched configurations, data size, or even subtle differences in table structures or metadata. Let’s work through the solutions step by step:

1. Check API Limits and Quotas

Salesforce imposes limits on API calls, and exceeding these can disrupt data transfers. Here’s how to check:

  • Monitor Your API Usage:
    Go to Setup > System Overview in Salesforce or use the REST API (see the Python sketch after this list):

    GET /services/data/vXX.0/limits
  • Solution for Limits:
    If you're nearing the API cap:

    • Schedule transfer jobs to run during less busy times.
    • Consider requesting a higher API quota from Salesforce.
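
If you prefer to check this programmatically, the sketch below calls the Salesforce REST limits resource with the requests library. It is a minimal sketch: the instance URL, API version, and OAuth access token are placeholders you would replace with your own values.

    import requests

    # Placeholders: replace with your Salesforce instance URL, API version,
    # and a valid OAuth access token obtained out of band.
    INSTANCE_URL = "https://yourinstance.my.salesforce.com"
    API_VERSION = "v59.0"
    ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"

    # Call the REST "limits" resource and report daily API request usage.
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/limits",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()

    daily = resp.json()["DailyApiRequests"]
    used = daily["Max"] - daily["Remaining"]
    print(f"Daily API requests: {used} used of {daily['Max']}, {daily['Remaining']} remaining")

If Remaining is consistently low around your transfer window, rescheduling the transfer or requesting a higher Salesforce API quota is the usual fix.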

2. Investigate Table-Specific Differences

Even if the configuration appears identical for all tables, hidden discrepancies can cause issues:

  • Field-Level Data Issues:
    Some fields may contain nulls or special characters, or may exceed BigQuery’s limits (e.g., string length). Query the problematic table directly in Salesforce and inspect its data (see the sketch after this list).

  • Data Volume:
    Large tables may experience timeouts or slow performance. Use incremental transfers to fetch data in smaller chunks.
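
To make the field-level check concrete, here is a minimal sketch that pulls a small sample of records from the failing object over the REST query endpoint and flags unusually long string values. The instance URL, token, object name, and the 10,000-character threshold are illustrative placeholders; FIELDS(ALL) requires a LIMIT of at most 200.

    import requests

    # Placeholders: instance URL, API version, OAuth access token, and the
    # API name of the object whose transfers keep failing.
    INSTANCE_URL = "https://yourinstance.my.salesforce.com"
    API_VERSION = "v59.0"
    ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"
    OBJECT_NAME = "MyObject__c"

    # Sample up to 200 records; FIELDS(ALL) caps the LIMIT at 200.
    soql = f"SELECT FIELDS(ALL) FROM {OBJECT_NAME} LIMIT 200"
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=60,
    )
    resp.raise_for_status()

    # Flag string values that might exceed downstream limits (the threshold
    # here is arbitrary; adjust it to your schema).
    for record in resp.json()["records"]:
        for field, value in record.items():
            if isinstance(value, str) and len(value) > 10000:
                print(f"{record.get('Id')}: field {field} has {len(value)} chars")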

3. Use Incremental Loading

For large tables, switch to incremental loading instead of full data loads:

  • Modify Your Query: Use fields like LastModifiedDate to pull only updated records. Example:

    SELECT *
    FROM YourTable
    WHERE LastModifiedDate > LAST_RUN_DATE
  • Enable Incremental Loading: Adjust the BigQuery Data Transfer configuration to fetch only incremental data (a watermark sketch follows below).
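
The managed connector handles incremental fetches itself once configured, but if you run (or prototype) your own extraction, the watermark pattern behind LAST_RUN_DATE looks roughly like this. It is a sketch only: the project ID, destination table, and field list are placeholders, and it assumes the destination table carries the LastModifiedDate column.

    from google.cloud import bigquery

    # Placeholders: project ID and the BigQuery table the transfer writes to.
    PROJECT_ID = "my-project"
    DEST_TABLE = "my_dataset.YourTable"

    bq = bigquery.Client(project=PROJECT_ID)

    # Read the high-water mark already loaded into BigQuery.
    rows = bq.query(
        f"SELECT MAX(LastModifiedDate) AS watermark FROM `{DEST_TABLE}`"
    ).result()
    watermark = next(iter(rows)).watermark  # None if the table is empty

    # Build the incremental SOQL filter. SOQL datetime literals are unquoted
    # ISO 8601 values, e.g. 2024-05-01T00:00:00Z.
    base_soql = "SELECT Id, Name, LastModifiedDate FROM YourTable"
    if watermark is not None:
        soql = f"{base_soql} WHERE LastModifiedDate > {watermark:%Y-%m-%dT%H:%M:%SZ}"
    else:
        soql = base_soql

    print(soql)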

4. Debug Connector Errors

The "ConnectorStatusException" and "Failed to read next record" errors could stem from issues with Salesforce objects or metadata:

  • Custom vs. Standard Objects:
    Verify that all custom fields and objects are accessible via the Salesforce API, and ensure you have sufficient permissions (see the describe sketch after this list).

  • Metadata Updates:
    If new fields or objects were added in Salesforce recently, refresh the metadata in the Data Transfer Service to ensure everything is up-to-date.
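
One quick way to check object and field accessibility is the REST describe resource, called as the same integration user the transfer authenticates with. The sketch below is illustrative; the instance URL, token, and object name are placeholders.

    import requests

    # Placeholders: instance URL, API version, OAuth token for the transfer's
    # integration user, and the API name of the failing object.
    INSTANCE_URL = "https://yourinstance.my.salesforce.com"
    API_VERSION = "v59.0"
    ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"
    OBJECT_NAME = "MyObject__c"

    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/sobjects/{OBJECT_NAME}/describe",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    describe = resp.json()

    # Object-level access flags for this user.
    print(f"queryable={describe['queryable']}, retrieveable={describe['retrieveable']}")

    # Field-level security is applied implicitly: describe only returns the
    # fields this user can see, so compare the list against your full schema.
    visible = [f["name"] for f in describe["fields"]]
    print(f"{len(visible)} fields visible: {', '.join(sorted(visible)[:10])} ...")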

5. Analyze BigQuery Logs

If the logs you’ve reviewed aren’t detailed enough:

  • Enable verbose logging for the data transfer job.
  • Check execution logs in Cloud Logging for additional error details.
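
As a starting point, a sketch like the one below pulls recent error entries from Cloud Logging with the Python client. It assumes transfer-run logs land under the bigquery_dts_config resource type; adjust the filter (and the placeholder project ID) to match what you actually see in the Logs Explorer.

    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project="my-project")  # placeholder project ID

    # Assumed resource type for BigQuery Data Transfer Service run logs.
    log_filter = 'resource.type="bigquery_dts_config" AND severity>=ERROR'

    # Print the most recent error entries emitted by transfer runs.
    entries = client.list_entries(
        filter_=log_filter,
        order_by=cloud_logging.DESCENDING,
        max_results=20,
    )
    for entry in entries:
        print(entry.timestamp, entry.payload)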

6. Address Retriable Errors

The "UNABLE_TO_RETRY" error typically occurs when retry attempts are exhausted or a payload is stuck mid-process.

  • Manual Retry: Clone the job and rerun it manually to see if it succeeds (see the sketch after this list).
  • Contact Google Support: If the problem persists, share your error logs with Google Cloud Support for further investigation—it may be an internal bug.
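
If you'd rather trigger the retry programmatically than clone the job in the console, the Data Transfer API exposes on-demand runs. A minimal sketch, assuming you substitute the full resource name of your transfer config (shown on its details page):

    from google.cloud import bigquery_datatransfer
    from google.protobuf import timestamp_pb2

    # Placeholder: full resource name of the failing transfer config.
    CONFIG_NAME = (
        "projects/123456789/locations/us/"
        "transferConfigs/00000000-0000-0000-0000-000000000000"
    )

    client = bigquery_datatransfer.DataTransferServiceClient()

    # Request an on-demand run for "now" instead of waiting for the schedule.
    run_time = timestamp_pb2.Timestamp()
    run_time.GetCurrentTime()

    response = client.start_manual_transfer_runs(
        request={"parent": CONFIG_NAME, "requested_run_time": run_time}
    )
    for run in response.runs:
        print(run.name, run.state)

If the on-demand run fails with the same UNABLE_TO_RETRY error, capture the run name and timestamps from the output; that is exactly the detail Google Cloud Support (or the preview support alias mentioned above) will ask for.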

7. Consider Third-Party Tools

If the native Salesforce to BigQuery connector continues to fail, tools like Windsor.ai might be a more robust option. These platforms often provide advanced error handling and greater customization for data pipelines.

Next Steps

  • Test with a smaller subset of data from the problematic tables.
  • Verify API limits and ensure the correct permissions are set for all fields.
  • If issues persist, gather detailed logs and share them with Google Support for deeper analysis.

Hey @Snoshone07, thanks for your answer! I retested the integration and now it is working properly!
Do you know if it is also possible to do incremental loads with BigQuery Data Transfer? I scheduled my transfer daily, but judging by the rows ingested and the duration, it loads the full object every time.