Dataflow job converted into a pipeline: scheduled run not executing

Dear All,

I'm quite new to Dataflow. I've built a job to read data from SQL Server and write it into BigQuery. When I created the job and ran it, I could query the data in BigQuery, but after importing the job as a pipeline and setting it to run daily at a specific time, I get the following error:

jsonPayload: {
  "@type": "type.googleapis.com/google.cloud.scheduler.logging.AttemptFinished",
  "jobName": "projects/myproject/locations/southamerica-east1/jobs/datapipelines-primaviacargaerp",
  "status": "INVALID_ARGUMENT",
  "targetType": "HTTP",
  "url": "https://datapipelines.googleapis.com/v1/projects/myproject/locations/southamerica-east1/pipelines/primaviacargaerp:run"
}
logName: "projects/myproject/logs/cloudscheduler.googleapis.com%2Fexecutions"
receiveTimestamp: "2023-08-16T01:29:00.603723407Z"
resource: {…}
severity: "ERROR"
timestamp: "2023-08-16T01:29:00.603723407Z"

Just for testing, in the Cloud Scheduler job I removed the ":run" from the URL; the status then became "NOT_FOUND" instead of "INVALID_ARGUMENT". I really don't know what to do to have it scheduled and run every day.


The error message you're receiving suggests that the Cloud Scheduler job is attempting to invoke the run method on a Dataflow pipeline that might not exist or is incorrectly referenced.

Here are steps to troubleshoot:

  1. Pipeline Existence:

    • Ensure the Dataflow pipeline you're trying to run is indeed created. Navigate to the Dataflow console and check under the "Pipelines" tab.
  2. Pipeline Name Verification:

    • Confirm the name of the pipeline specified in the Cloud Scheduler job. It should follow the format projects/<project_id>/locations/<location>/pipelines/<pipeline_name>.
  3. Cloud Scheduler Logs:

    • Examine the logs for the Cloud Scheduler job. These logs can offer more detailed insights into the exact nature of the error.
  4. Additional Considerations:

    • Ensure the pipeline isn't utilizing features incompatible with Cloud Scheduler.
    • Check if the pipeline uses a temporary location that might have expired.
    • Confirm that the pipeline's data source is accessible and hasn't changed.
  5. Documentation:

    • Review the Data Pipelines and Cloud Scheduler documentation for the expected request format and scheduling requirements.

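As a sanity check for the name format in step 2, the resource name and the `:run` URL can be assembled and validated like this. This is only a sketch (the helper name is for illustration), using the project, region, and pipeline name from the log in the question:

```python
import re

def pipeline_run_url(project: str, location: str, pipeline: str) -> str:
    """Build the Data Pipelines ':run' URL that Cloud Scheduler should call."""
    name = f"projects/{project}/locations/{location}/pipelines/{pipeline}"
    # The resource name must match this shape exactly; a stray segment, or a
    # job name (".../jobs/...") where a pipeline name belongs, causes errors.
    if not re.fullmatch(r"projects/[^/]+/locations/[^/]+/pipelines/[^/]+", name):
        raise ValueError(f"malformed resource name: {name}")
    return f"https://datapipelines.googleapis.com/v1/{name}:run"

print(pipeline_run_url("myproject", "southamerica-east1", "primaviacargaerp"))
```

Comparing the printed URL character by character against the one in the Cloud Scheduler job configuration can rule out a simple naming mistake.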
I appreciate your answer, @ms4446 

1. yes the pipeline exists

2. The name follows the format. By the way, when you say the name, is it the one in the URL?

3. The log message is the one I posted in the topic.

4. It will take time, but I'll try to check whether the pipeline uses features incompatible with Cloud Scheduler (I have no idea how I'll discover that). I'm using the Cloud Console interface to schedule the pipeline execution. The pipeline doesn't use a temporary location. The source is a SQL Server database; I connect normally, and when I run the Dataflow job it reads and records the data. From the Dataflow job I created the Dataflow pipeline via "import as pipeline", and from the pipeline settings I scheduled it to run.

Thank you for the additional details. Let's address each of your points:

  1. Pipeline Existence:

    • Since the pipeline exists, we can rule out issues related to its creation.
  2. Pipeline Name in URL:

    • Yes, when I refer to the name, it's the identifier in the URL that Cloud Scheduler uses to trigger the pipeline. The URL in the Cloud Scheduler job should point to the correct endpoint to initiate the Dataflow pipeline. The format projects/<project_id>/locations/<location>/pipelines/<pipeline_name> is a general representation of how Google Cloud resources are often identified. In your case, the URL you provided (https://datapipelines.googleapis.com/v1/projects/myproject/locations/southamerica-east1/pipelines/pr...) seems to follow the correct format.
  3. Log Message:

    • The log message indicates an INVALID_ARGUMENT error. This typically means that the request made to the Dataflow API has some incorrect parameters. Since removing :run changed the error to not found, it suggests that the :run endpoint might be expecting some additional parameters.
  4. Pipeline Compatibility & Execution:

    • If you're using the Cloud Console interface to schedule the pipeline execution and everything works manually (i.e., when you run the Dataflow job directly), then the issue likely lies in how the Cloud Scheduler job is configured to trigger the pipeline.
    • The fact that the manual execution works suggests that the pipeline itself, its source, and its destination are all correctly configured.

Given the above, here are some further steps to consider:

  • HTTP Headers & Body: Ensure that the Cloud Scheduler job is set to make a POST request. Additionally, check if the Dataflow API expects any specific headers or a request body when triggering the pipeline. Sometimes, certain parameters or configurations might be required in the request body.

  • OAuth Scopes: Ensure that the Cloud Scheduler job has the correct OAuth scopes set up to allow it to trigger Dataflow jobs.

  • Recreate the Scheduler Job: Sometimes, there might be underlying issues that aren't immediately apparent. Consider deleting and recreating the Cloud Scheduler job to see if that resolves the issue.

Lastly, if you've made any changes to the Dataflow job after creating the pipeline, ensure that those changes are reflected in the pipeline version that the Cloud Scheduler job is trying to trigger.
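To see the raw API response, the request can also be reproduced outside Cloud Scheduler. A minimal sketch with the Python standard library, using the URL from the original log; it only composes the request (the Authorization header, normally added by Scheduler's OAuth setting, is left as a comment rather than invented here):

```python
import json
import urllib.request

# URL taken from the Cloud Scheduler log in this thread.
url = ("https://datapipelines.googleapis.com/v1/"
       "projects/myproject/locations/southamerica-east1/"
       "pipelines/primaviacargaerp:run")

req = urllib.request.Request(
    url,
    data=json.dumps({}).encode(),  # an empty JSON body; add parameters if the API reports them missing
    headers={
        "Content-Type": "application/json",
        # "Authorization": "Bearer <token from `gcloud auth print-access-token`>",
    },
    method="POST",
)
print(req.get_method(), req.full_url)
```

Sending this request (with a valid bearer token) from Cloud Shell and inspecting the error body often reveals which argument the API considers invalid, which the Scheduler log does not show.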

Hi @ms4446 

Thank you again for your time on helping me.

About the INVALID_ARGUMENT: you wrote that the :run endpoint might be expecting additional parameters. I tried to look into it, but do you have any idea where I can find them? I mean the possibly missing parameters.

HTTP Headers & Body: it is set as POST, and I will try to dig deeper into the Dataflow API.

OAuth is set like this:

(screenshot of the OAuth configuration: RonaldoSales_0-1692304408925.png)

And about recreating the Scheduler job: I've done that many times, but always via the console interface from "+ IMPORT AS PIPELINE", and I just fill in the schedule parameters when creating the pipeline.

The error message you are getting suggests that there might be an authentication issue when the Cloud Scheduler job tries to invoke the :run endpoint on the Dataflow pipeline.

Here are some things you can try to troubleshoot the issue:

  • Check the service account permissions. Make sure that the service account associated with the Cloud Scheduler job has the necessary permissions to access the Dataflow API. It should have roles like roles/dataflow.developer and roles/dataflow.runner.
  • Verify the OAuth token validity. Confirm that the OAuth token being used by the Cloud Scheduler job is valid and has not expired.
  • Check the Cloud Scheduler logs. Dive deeper into the logs for the Cloud Scheduler job. These logs can provide more detailed insights into the exact nature of the error.
  • Create a new Cloud Scheduler job using a different service account. Consider creating a new Cloud Scheduler job using a different service account to see if the issue persists.
  • Generate a fresh OAuth token. Create a new token and configure the Cloud Scheduler job to use it.
  • Refer to the documentation. Familiarize yourself with the Cloud Scheduler documentation, Dataflow documentation, and OAuth 2.0 documentation for more insights.

Thank you again @ms4446 

I'll go on with the investigation. Just for testing, I created a simple job using an available template, reading a BigQuery table and exporting to a Parquet file on Cloud Storage. I created a new service account and gave it the roles Dataflow Developer and Dataflow Administrator (I didn't find Runner), and got a permission-denied error. Then I gave it access to Cloud Storage and got the "INVALID_ARGUMENT" again.

So based on your suggestions I'll explore a little bit more before asking for more help.

The INVALID_ARGUMENT error can be a bit generic, but given the context, it's likely related to the configuration or the request being made.

Here are a few additional things to consider:

  1. Service Account Roles:

    • In addition to Dataflow Developer and Dataflow Administrator, ensure the service account has Storage Object Creator, Storage Object Viewer, and BigQuery User roles if you're reading from BigQuery and writing to Cloud Storage.
  2. Template Parameters:

    • Ensure that when you're using a template, all required parameters are provided. Even if the template is a pre-built one, it might require specific parameters to be passed.
  3. OAuth Scopes:

    • Ensure that the OAuth scope for the Cloud Scheduler job includes both Dataflow and the other services you're interacting with (BigQuery, Cloud Storage).
  4. Manual Execution:

    • Try executing the Dataflow job manually without using Cloud Scheduler. This can help determine if the issue is with the job itself or the way it's being triggered.
  5. Logs:

    • Dive deeper into the logs. Both Cloud Scheduler and Dataflow provide detailed logs that can offer more insights into the exact nature of the error. Look for any additional error messages or warnings that might give clues.
  6. Service Account JSON Key:

    • If you're using a service account JSON key for authentication, ensure it's correctly set up and hasn't been revoked or expired.

Remember, Google Cloud Platform has many interdependencies, and sometimes permissions or configurations in one service can affect another. It's a process of elimination to pinpoint the exact cause.
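The role grants from point 1 can be applied with `gcloud projects add-iam-policy-binding`. This sketch only prints the commands to run in Cloud Shell; the project and service-account names are placeholders, not values from this thread:

```python
# Placeholders; substitute your own project and service account.
project = "myproject"
sa = "scheduler-sa@myproject.iam.gserviceaccount.com"  # hypothetical name

# Roles discussed above for a BigQuery-to-Cloud-Storage pipeline.
roles = [
    "roles/dataflow.developer",
    "roles/storage.objectCreator",
    "roles/storage.objectViewer",
    "roles/bigquery.user",
]

# Emit one grant command per role.
for role in roles:
    print(f"gcloud projects add-iam-policy-binding {project} "
          f"--member=serviceAccount:{sa} --role={role}")
```

Printing the commands first (instead of executing them blindly) makes it easy to review exactly which bindings will be added before touching the project's IAM policy.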

@ms4446  yesterday I was making some tests. I found that the INVALID_ARGUMENT doesn't happen when I avoid "+ IMPORT AS PIPELINE" on the Dataflow Jobs screen. So I created my pipeline from the Dataflow Pipelines screen via "+ CREATE DATA PIPELINE", still using an available template; so there seems to be a bug in the "+ import as pipeline" functionality.

So the error has now changed, and I'm trying to troubleshoot it:

ERROR 2023-08-20T16:24:35.090943627Z Workflow failed. Causes: S09:Read-from-JdbcIO-Create-Values-Read-CreateSource--ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-Bounde/ProcessElementAndRestrictionWithSizing+Read from JdbcIO/JdbcIO.ReadAll/ParDo(Read)/ParMultiDo(Read)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)/ParMultiDo(Anonymous)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/MapElements/Map/ParMultiDo(Anonymous) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f) at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39) at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source) at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252) at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788) at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142) at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214) at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320) at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source) at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096) at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142) at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656) at org.apache.beam.fn.harness.…
org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat 
org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv,\n\n Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c)\n\tat org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat 
org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:653)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\nCaused by: com.microsoft.sqlserver.jdbc.SQLServerException: Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c\n\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:259)\n\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)\n\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4548)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3409)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:85)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3373)\n\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7344)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2713)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2261)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1921)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1762)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1077)\n\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:623)\n\tat org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:52)\n\tat org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:374)\n\tat org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:106)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:649)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat 
org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat 
org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv",
"insertId": "8lvhbvch4h",
"resource": {
"type": "dataflow_step",
"labels": {
"job_id": "2023-08-20_09_20_00-17119339413247852494",
"job_name": "carga-erp-primavia-mp--1692548400-58782424111294090",
"region": "southamerica-east1",
"step_id": "",
"project_id": "167979788553"
}
},
"timestamp": "2023-08-20T16:24:35.090943627Z",
"severity": "ERROR",
"labels": {
"dataflow.googleapis.com/region": "southamerica-east1",
"dataflow.googleapis.com/job_name": "carga-erp-primavia-mp--1692548400-58782424111294090",
"dataflow.googleapis.com/log_type": "system",
"dataflow.googleapis.com/job_id": "2023-08-20_09_20_00-17119339413247852494"
},
"logName": "projects/dac-dados/logs/dataflow.googleapis.com%2Fjob-message",
"receiveTimestamp": "2023-08-20T16:24:36.742832022Z"
}

The error message you're encountering indicates an issue with the JDBC connection to your database. Specifically, "Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)" — where "Falha de logon do usuário" is Portuguese for "Login failed for user" — means the user "wk_dac" failed to authenticate to SQL Server.

Possible reasons for this include:

  1. Incorrect username or password.
  2. Insufficient permissions for the user "wk_dac" to access the database.
  3. The database might be temporarily unavailable or facing connectivity issues.

To address this:

  • Double-check the username and password to ensure they're accurate.
  • Confirm that "wk_dac" has the required permissions in the database.
  • Verify the database's status and ensure it's operational.

Further steps to consider:

  • Ensure the JDBC driver used is compatible, correctly installed, and configured.
  • Confirm that the database is not only running but also accessible from the Dataflow workers. This might involve checking network configurations, firewalls, or IP whitelists.
  • Test with an alternative username and password, if possible.
  • Consider running the pipeline locally or in a different environment to determine if the issue is specific to the current setup.
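One way to narrow this down is to try the same JDBC URL and credentials outside of Dataflow, from a machine with network access to the SQL Server. The sketch below assumes a hypothetical host, database, and a `DB_PASSWORD` environment variable — substitute your own values, and have the Microsoft JDBC driver on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class JdbcLoginCheck {
    // Builds a SQL Server JDBC URL; host/port/database are placeholders.
    static String buildUrl(String host, int port, String db) {
        return "jdbc:sqlserver://" + host + ":" + port
                + ";databaseName=" + db + ";encrypt=true";
    }

    public static void main(String[] args) {
        String url = buildUrl("sqlserver.example.internal", 1433, "erp");
        System.out.println(url);
        try (Connection c = DriverManager.getConnection(
                url, "wk_dac", System.getenv("DB_PASSWORD"))) {
            System.out.println("login ok");
        } catch (SQLException e) {
            // A login failure here reproduces the Dataflow error
            // outside the pipeline, isolating it from Dataflow itself.
            System.err.println("login failed: " + e.getMessage());
        }
    }
}
```

If this succeeds where the scheduled pipeline fails, the credentials themselves are fine and the problem is more likely in how the pipeline stores or passes its connection parameters.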

@ms4446  thanks for the troubleshooting steps. But look at the scenario:

The job runs with no error.

If I create the pipeline using "+ IMPORT AS PIPELINE" from the Jobs screen, using the same job I just ran, I get the error INVALID_ARGUMENT.

If I create the pipeline from the Pipelines screen using "CREATE DATA PIPELINE", I get the error "Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)".

The user wk_dac does have access to the database, because the job runs and the data is stored in BigQuery.

To create the job or the pipeline I use the Google template "SQL Server to BigQuery".

We are trying a different JDBC driver, but what confuses me is that the job on its own runs with no problem.

Thank you again for your time and suggestions

Based on the details you've shared, it appears that the discrepancies arise from the method used to create the pipeline.

  1. Using "Import as Pipeline" from the Jobs screen:

    • This method creates a new pipeline based on an existing job. However, it might not retain all the configuration details of the original job, potentially leading to the INVALID_ARGUMENT error. There could be a bug or a missing configuration that's causing this issue.
  2. Using "Create Data Pipeline" from the Pipelines screen:

    • This approach crafts a new pipeline from scratch, requiring you to input all the configuration details manually. The Cannot create PoolableConnectionFactory error suggests there might be a discrepancy in the JDBC connection details or other configurations.

To address these issues:

  • Re-import the Pipeline: Try importing the pipeline from the Jobs screen once more. Ensure that you meticulously review and specify all the configuration details from the original job.

  • Re-create the Pipeline: When using the "Create Data Pipeline" option from the Pipelines screen, double-check all configurations, especially those related to database connections, to ensure they match the successful job's settings.

If, after these steps, the issue persists, reaching out to Google Cloud Support might be the best course of action. They could provide insights into any known issues or specific configurations required for the template you're using.
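One thing worth double-checking while re-creating the pipeline is that the resource name Cloud Scheduler targets matches the pipeline exactly, including the ":run" suffix. A small sketch of the expected format, using the project, location, and pipeline name from the log earlier in this thread:

```java
public class PipelineName {
    // Builds the resource name in the format the Data Pipelines API expects:
    // projects/<project_id>/locations/<location>/pipelines/<pipeline_name>
    static String resourceName(String project, String location, String pipeline) {
        return String.format("projects/%s/locations/%s/pipelines/%s",
                project, location, pipeline);
    }

    static String runUrl(String project, String location, String pipeline) {
        // The ":run" suffix is required; removing it yields NOT_FOUND,
        // as observed when testing from Cloud Scheduler.
        return "https://datapipelines.googleapis.com/v1/"
                + resourceName(project, location, pipeline) + ":run";
    }

    public static void main(String[] args) {
        System.out.println(
            runUrl("myproject", "southamerica-east1", "primaviacargaerp"));
    }
}
```

If the name, location, or project in the Scheduler job's URL differs even slightly from the pipeline resource, the API returns INVALID_ARGUMENT or NOT_FOUND rather than a more descriptive error.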

@ms4446  I was still troubleshooting, and as a workaround I was running the job manually every day, but yesterday and again today the job failed with a new error:

"org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target". ClientConnectionId:e5bc7539-1271-4f62-8801-c9d7c129e35d)

I didn't find any job config to set up a certificate. Do you have any idea?

Hi @RonaldoSales,

The error message you're seeing is related to SSL encryption when trying to establish a connection to the SQL Server. The specific error "PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target" indicates that the Java application (in this case, the Dataflow job) cannot validate or trust the certificate presented by the SQL Server.

Here are some steps to troubleshoot and potentially resolve the issue:

  1. Trust the Certificate:

    • If your SQL Server is using a self-signed certificate or a certificate from an internal certificate authority (CA), you'll need to ensure that the certificate is added to the Java Keystore of the environment where your Dataflow job runs. This will allow Java to trust the certificate when establishing an SSL connection.
  2. Update JDBC Driver:

    • Ensure you're using the latest JDBC driver compatible with your SQL Server version. Sometimes, older drivers might have issues with certain SSL configurations.
  3. Disable SSL (Not Recommended for Production):

    • As a temporary measure for testing purposes, you can disable SSL encryption in the JDBC connection string. This is typically done by adding encrypt=false to the connection string. However, this is not recommended for production environments as it exposes your data to potential eavesdropping.
  4. Check SQL Server Configuration:

    • Ensure that the SQL Server is correctly configured for SSL encryption. If there have been any recent changes to the server's SSL settings or certificates, it might be causing this issue.
  5. Java Version:

    • Check the version of Java being used by the Dataflow job. Older versions of Java might not support certain SSL/TLS protocols or might have issues with certificate validation. Consider updating to a newer version if applicable.
  6. Custom Dataflow Image:

    • If you have the capability, you can create a custom Dataflow worker image that includes the necessary trusted certificates. This way, when Dataflow spins up workers based on this image, they will already trust the SQL Server's certificate.

Remember, while disabling SSL can be a quick way to diagnose if SSL is the root cause, it's crucial to ensure all connections are encrypted, especially in production environments, to maintain data security and integrity.
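For the diagnostic step above, the Microsoft JDBC driver exposes `encrypt` and `trustServerCertificate` properties in the connection string. A minimal sketch of building the two variants (host and database names are placeholders); `trustServerCertificate=true` keeps the connection encrypted but skips certificate-chain validation, which is useful only to confirm that the PKIX error is a trust problem:

```java
public class SslUrlVariants {
    // trustServerCertificate=true bypasses certificate validation while
    // keeping TLS -- diagnostic use only, never a permanent production fix.
    static String url(String host, String db, boolean trustCert) {
        return "jdbc:sqlserver://" + host + ":1433;databaseName=" + db
                + ";encrypt=true;trustServerCertificate=" + trustCert;
    }

    public static void main(String[] args) {
        // Strict validation (fails with PKIX error if the cert isn't trusted):
        System.out.println(url("sqlserver.example.internal", "erp", false));
        // Trust-anything variant for diagnosis:
        System.out.println(url("sqlserver.example.internal", "erp", true));
    }
}
```

If the `trustServerCertificate=true` variant connects, the server's certificate chain is the problem, and importing the server's CA certificate into the worker's Java truststore (step 1 above) is the proper long-term fix.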