I have recently encountered the following errors:
Transfer run failed due to internal errors. Please try again later.
or, in other cases:
UNABLE_TO_RETRY: A retriable error could not be retried because a stream payload was received from the server (see go/xs-unable-to-retry-message-received) (old status: INTERNAL: Exception was thrown by the Connector implementation: Failed to read next record : Failed to query results com.google.cloud.hosted.integrationservice.common.ConnectorStatusException
We tested the transfer for multiple tables; many of them succeeded, but some always fail with the same error. However, the configuration is the same for all of these tables, and there is no difference between custom and standard Salesforce objects.
Do you have some hints regarding these errors?
Thanks in advance!
The Salesforce API has also been configured properly, as it works through another platform.
Unfortunately, there's no specific error cause in the BigQuery Data Transfer log.
Hi @DariotraSec - since the BigQuery Data Transfer Service for Salesforce is still in Preview, you can contact dts-preview-support@google.com for help.
Hi @DariotraSec it looks like you're facing two key challenges with your Salesforce to BigQuery data transfer: the generic "internal errors" failure and the UNABLE_TO_RETRY / ConnectorStatusException error that only some tables hit.
These problems can result from a variety of causes, such as API limits, mismatched configurations, data size, or even subtle differences in table structures or metadata. Let’s work through the solutions step by step:
1. Check API Limits and Quotas
Salesforce imposes limits on API calls, and exceeding these can disrupt data transfers. Here’s how to check:
Monitor Your API Usage:
Go to Setup > System Overview in Salesforce or use the REST API:
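For example, here is a minimal Python sketch that reads the DailyApiRequests counters from the REST limits endpoint; the instance URL, access token, and API version are placeholders, not values from your setup:

```python
import requests

INSTANCE_URL = "https://your-instance.my.salesforce.com"  # placeholder: your org's instance URL
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"          # placeholder: a valid OAuth access token
API_VERSION = "v59.0"

# The /limits endpoint returns the org's API quotas, including DailyApiRequests.
resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/limits/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

daily = resp.json()["DailyApiRequests"]
used = daily["Max"] - daily["Remaining"]
print(f"Daily API requests: {used} used of {daily['Max']} ({daily['Remaining']} remaining)")
```

If the remaining count drops sharply around your transfer schedule, the failing tables may simply be the ones that run after the quota is consumed.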
Solution for Limits:
If you're nearing the API cap, spread the transfer schedules across off-peak hours, reduce the transfer frequency, or ask Salesforce to raise your daily API allocation.
2. Investigate Table-Specific Differences
Even if the configuration appears identical for all tables, hidden discrepancies can cause issues:
Field-Level Data Issues:
Some fields may contain nulls or special characters, or may exceed BigQuery’s limits (e.g., string length). Query the problematic table directly in Salesforce and inspect its data (a sketch follows below).
Data Volume:
Large tables may experience timeouts or slow performance. Use incremental transfers to fetch data in smaller chunks.
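As a starting point for the field-level inspection mentioned above, here is a rough Python sketch; the object and field names (My_Custom_Object__c, Description__c) are hypothetical, and the thresholds are only illustrative:

```python
import requests

INSTANCE_URL = "https://your-instance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"          # placeholder
API_VERSION = "v59.0"

# Hypothetical object and fields -- swap in the table that keeps failing.
soql = "SELECT Id, Name, Description__c FROM My_Custom_Object__c LIMIT 200"

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/query/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": soql},
    timeout=60,
)
resp.raise_for_status()

# Flag values that commonly trip transfers: nulls, very long strings, control characters.
for record in resp.json()["records"]:
    for field, value in record.items():
        if field == "attributes":  # metadata block added by the REST API, not a data field
            continue
        if value is None:
            print(f"{record['Id']}: {field} is null")
        elif isinstance(value, str) and (
            len(value) > 10_000 or any(ord(c) < 32 and c not in "\n\r\t" for c in value)
        ):
            print(f"{record['Id']}: {field} is very long or contains control characters")
```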
3. Use Incremental Loading
For large tables, switch to incremental loading instead of full data loads:
Modify Your Query: Use fields like LastModifiedDate to pull only updated records. Example:
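A minimal Python sketch of such an incremental pull; the Account object, the 24-hour watermark, and the credentials are placeholders, and a real pipeline would persist the watermark between runs:

```python
from datetime import datetime, timedelta, timezone

import requests

INSTANCE_URL = "https://your-instance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"          # placeholder
API_VERSION = "v59.0"

# Illustrative watermark: records modified in the last 24 hours.
# A real pipeline would store the high-water mark from the previous run.
watermark = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")

soql = (
    "SELECT Id, Name, LastModifiedDate "
    "FROM Account "                           # placeholder object
    f"WHERE LastModifiedDate > {watermark} "  # SOQL datetime literals are unquoted
    "ORDER BY LastModifiedDate"
)

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/query/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"q": soql},
    timeout=60,
)
resp.raise_for_status()
print(f"Records changed since {watermark}: {resp.json()['totalSize']}")
```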
Enable Incremental Loading: Adjust the BigQuery Data Transfer configuration to fetch only incremental data.
4. Debug Connector Errors
The "ConnectorStatusException" and "Failed to read next record" errors could stem from issues with Salesforce objects or metadata:
Custom vs. Standard Objects:
Verify that all custom fields and objects are accessible via the Salesforce API, and ensure the integration user has sufficient permissions; a describe call (sketched below) shows exactly which fields that user can see.
Metadata Updates:
If new fields or objects were added in Salesforce recently, refresh the metadata in the Data Transfer Service to ensure everything is up-to-date.
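For the accessibility check mentioned under "Custom vs. Standard Objects", one option is a describe call against the failing object. This is only a sketch, with the object name and credentials as placeholders:

```python
import requests

INSTANCE_URL = "https://your-instance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_ACCESS_TOKEN"          # placeholder
API_VERSION = "v59.0"
OBJECT_NAME = "My_Custom_Object__c"                       # placeholder: the failing object

resp = requests.get(
    f"{INSTANCE_URL}/services/data/{API_VERSION}/sobjects/{OBJECT_NAME}/describe/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

describe = resp.json()
# Fields the integration user cannot access are simply missing from the describe
# result, so compare this list against the columns the transfer expects.
print(f"{OBJECT_NAME}: queryable={describe['queryable']}, visible fields={len(describe['fields'])}")
for field in describe["fields"]:
    print(f"  {field['name']} ({field['type']})")
```

Run this with the same credentials the Data Transfer Service uses, so the visibility you see matches what the connector sees.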
5. Analyze BigQuery Logs
If the logs you’ve reviewed aren’t detailed enough, query Cloud Logging directly; the transfer run entries there usually carry more detail than what the Data Transfer Service UI shows.
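Here is a rough Python sketch using the Cloud Logging client library; the project ID is a placeholder, and the filter assumes the bigquery_dts_config resource type that Data Transfer Service runs log under:

```python
from google.cloud import logging

client = logging.Client(project="your-project-id")  # placeholder project ID

# Assumption: Data Transfer Service runs log under the bigquery_dts_config resource type.
log_filter = 'resource.type="bigquery_dts_config" AND severity>=ERROR'

for entry in client.list_entries(
    filter_=log_filter, order_by=logging.DESCENDING, max_results=20
):
    print(entry.timestamp, entry.severity, entry.payload)
```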
6. Address Retriable Errors
The "UNABLE_TO_RETRY" error typically occurs when retry attempts are exhausted or a payload is stuck mid-process.
7. Consider Third-Party Tools
If the native Salesforce to BigQuery connector continues to fail, tools like Windsor.ai might be a more robust option. These platforms often provide advanced error handling and greater customization for data pipelines.
Next Steps
Start with the API limit check and the table-level inspection above. If a specific table keeps failing with an otherwise identical configuration, note the transfer run ID and the failing object name and share them with the preview support address mentioned earlier in this thread.
Hey @Snoshone07, thanks for your answer! I retested the integration and now it is working properly!
Do you know if it is also possible to have incremental loading with BigQuery Data Transfer? I scheduled my transfer daily, but judging by the rows ingested and the duration, it is loading the full object every time.