Dear All,
I'm quite new to Dataflow. I've built a job to read data from SQL Server and write it into BigQuery. When I created the job and ran it, I could query the data in BigQuery, but when I import the job as a pipeline and set it to run daily at a specific time, I get the following error:
Just for testing, in Cloud Scheduler I removed the ":run" from the URL, and the status became "not found" instead of "invalid argument". I really don't know what to do to have it scheduled and run every day.
The error message you're receiving suggests that the Cloud Scheduler job is attempting to invoke the :run method on a Dataflow pipeline that might not exist or is incorrectly referenced.
Here are steps to troubleshoot:
1. Pipeline Existence: Confirm that the pipeline actually exists in the project and region the scheduler is pointing at (a minimal sketch for checking this directly against the API follows this list).
2. Pipeline Name Verification: Make sure the resource name in the URL follows the format projects/<project_id>/locations/<location>/pipelines/<pipeline_name>.
3. Cloud Scheduler Logs: Check the Cloud Scheduler job's logs for the full error returned by the API; they are usually more specific than the status shown in the console.
4. Additional Considerations: Check whether the pipeline uses any features that are incompatible with Cloud Scheduler, how you are scheduling the execution, and whether the pipeline relies on a temporary location that may not be accessible at run time.
5. Documentation: Review the Dataflow Data Pipelines documentation for the expected way to schedule recurring runs.
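If it helps, here is a minimal sketch for points 1 and 2: it fetches the pipeline resource directly from the Data Pipelines API. It assumes Application Default Credentials, and the project, location, and pipeline names are placeholders you would swap for your own.

```python
# Minimal sketch: confirm the pipeline resource exists and inspect it.
# Assumes Application Default Credentials; the resource names are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

name = "projects/my-project/locations/southamerica-east1/pipelines/my-pipeline"
resp = session.get(f"https://datapipelines.googleapis.com/v1/{name}")
print(resp.status_code)
print(resp.json() if resp.ok else resp.text)
```

A 404 here would mean the scheduler is pointing at the wrong resource name; a 200 means the name itself is fine and the problem lies elsewhere.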
I appreciate your answer @ms4446
1. Yes, the pipeline exists.
2. The name follows the format. By the way, when you say the name, do you mean the one in the URL?
3. The log message is the one I've posted in the topic.
4. It will take time, but I'll try to check whether the pipeline uses features that are incompatible with Cloud Scheduler (I have no idea how I'll discover that). I'm using the Cloud Console interface to schedule the pipeline execution. The pipeline doesn't use a temporary location. The source is a SQL Server database; I connect normally, and when I run the Dataflow job it reads and records the data. From the Dataflow job I created a Dataflow pipeline via "import as pipeline", and from the pipeline settings I scheduled it to run.
Thank you for the additional details. Let's address each of your points:
Pipeline Existence: Since you've confirmed the pipeline exists, we can set that point aside.
Pipeline Name in URL: The pattern projects/<project_id>/locations/<location>/pipelines/<pipeline_name> is a general representation of how Google Cloud resources are often identified. In your case, the URL you provided (https://datapipelines.googleapis.com/v1/projects/myproject/locations/southamerica-east1/pipelines/pr...) seems to follow the correct format.
Log Message: You're receiving an INVALID_ARGUMENT error. This typically means that the request made to the Dataflow API has some incorrect parameters. Since removing :run changed the error to not found, it suggests that the :run endpoint might be expecting some additional parameters.
Pipeline Compatibility & Execution: It's still worth checking how the pipeline was created and whether it depends on anything a scheduled run cannot reproduce.
Given the above, here are some further steps to consider:
HTTP Headers & Body: Ensure that the Cloud Scheduler job is set to make a POST request. Additionally, check whether the Dataflow API expects any specific headers or a request body when triggering the pipeline; sometimes certain parameters or configurations are required in the request body (see the sketch at the end of this reply).
OAuth Scopes: Ensure that the Cloud Scheduler job has the correct OAuth scopes set up to allow it to trigger Dataflow jobs.
Recreate the Scheduler Job: Sometimes, there might be underlying issues that aren't immediately apparent. Consider deleting and recreating the Cloud Scheduler job to see if that resolves the issue.
Lastly, if you've made any changes to the Dataflow job after creating the pipeline, ensure that those changes are reflected in the pipeline version that the Cloud Scheduler job is trying to trigger.
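For reference, here is a minimal sketch of what the Cloud Scheduler HTTP target amounts to: a POST to the :run endpoint with an empty JSON body. It assumes Application Default Credentials, and the project, location, and pipeline names are placeholders; running it yourself should surface the full error message the API returns.

```python
# Minimal sketch: trigger a Data Pipelines run the same way Cloud Scheduler
# would. Assumes Application Default Credentials; names are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

name = "projects/my-project/locations/southamerica-east1/pipelines/my-pipeline"
resp = session.post(
    f"https://datapipelines.googleapis.com/v1/{name}:run",
    json={},  # the run method takes an empty request body
)
print(resp.status_code, resp.text)
```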
Hi @ms4446
Thank you again for your time helping me.
About the INVALID_ARGUMENT: you wrote that the :run endpoint might be expecting some additional parameters. I'll try to look into it, but do you have any idea where I can find them, i.e. the possibly missing parameters?
HTTP Headers & Body: It is set as POST, and I will try to dig deeper into the Dataflow API.
OAuth is set like this
And about recreating the Scheduler job: I've done that many times, but always through the console interface via "+ IMPORT AS PIPELINE", and I just fill in the schedule parameters when creating the pipeline.
The error message you are getting suggests that there might be an authentication issue when the Cloud Scheduler job tries to invoke the :run endpoint on the Dataflow pipeline.
Here are some things you can try to troubleshoot the issue:
Check the IAM roles of the service account that Cloud Scheduler and the pipeline run as: it should have the necessary Dataflow roles, such as roles/dataflow.developer and roles/dataflow.runner (a sketch for listing the roles actually bound to the service account follows below).
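A minimal sketch for that check, assuming Application Default Credentials; the project ID and service-account email are placeholders.

```python
# Minimal sketch: list the project-level roles granted to a service account,
# to confirm the Dataflow roles are actually bound to it.
# Assumes Application Default Credentials; names below are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

project = "my-project"
member = "serviceAccount:my-sa@my-project.iam.gserviceaccount.com"
policy = session.post(
    f"https://cloudresourcemanager.googleapis.com/v1/projects/{project}:getIamPolicy",
    json={},
).json()
for binding in policy.get("bindings", []):
    if member in binding.get("members", []):
        print(binding["role"])
```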
Thank you again @ms4446
I'll go on with the investigation. Just for testing, I've created a simple job using an available template, reading a BigQuery table and exporting it to a Parquet file on Cloud Storage. I created a new service account and gave it the roles Dataflow Developer and Dataflow Administrator (I didn't find Runner), and I got a permission denied error. Then I gave it access to Cloud Storage and got the "INVALID_ARGUMENT" again.
So based on your suggestions I'll explore a little bit more before asking for more help.
The INVALID_ARGUMENT error can be a bit generic, but given the context, it's likely related to the configuration or the request being made.
Here are a few additional things to consider:
Service Account Roles: In addition to Dataflow Developer and Dataflow Administrator, ensure the service account has the Storage Object Creator, Storage Object Viewer, and BigQuery User roles if you're reading from BigQuery and writing to Cloud Storage.
Template Parameters: Double-check that every required parameter of the template is provided and correctly formatted; a missing or malformed parameter is a common cause of INVALID_ARGUMENT.
OAuth Scopes: Confirm that whatever triggers the run is using the cloud-platform scope.
Manual Execution: Try launching the same template manually, outside the pipeline schedule, with the same parameters to see whether the error reproduces (a minimal sketch of a manual launch appears at the end of this reply).
Logs: Check Cloud Logging for entries from both the Data Pipelines service and the Dataflow job; they often contain a more specific message than the status shown in the console.
Service Account JSON Key: If you authenticate with a downloaded JSON key, make sure it belongs to the intended service account and hasn't been revoked.
Remember, Google Cloud Platform has many interdependencies, and sometimes permissions or configurations in one service can affect another. It's a process of elimination to pinpoint the exact cause.
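For the manual-execution check, here is a minimal sketch of launching a Google-provided Flex Template through the Dataflow API. It assumes Application Default Credentials; the template path and parameters are placeholders that you should copy from the job that already runs successfully.

```python
# Minimal sketch: launch the same Flex Template manually to compare its
# behaviour with the scheduled pipeline. Template path and parameters are
# placeholders - copy the exact values from the job that already works.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

project, location = "my-project", "southamerica-east1"
body = {
    "launchParameter": {
        "jobName": "manual-test-run",
        "containerSpecGcsPath": "gs://dataflow-templates/latest/flex/<template-name>",
        "parameters": {
            # copy the parameters from the working job here
        },
    }
}
resp = session.post(
    f"https://dataflow.googleapis.com/v1b3/projects/{project}/locations/{location}"
    "/flexTemplates:launch",
    json=body,
)
print(resp.status_code, resp.text)
```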
@ms4446 Yesterday I was running some tests. I found that when I do not use "+ import as pipeline" from the Dataflow Job screen, the INVALID_ARGUMENT doesn't happen. So I created my pipeline from the Dataflow Pipelines screen with "+ CREATE DATA PIPELINE", still using an available template, so there seems to be a bug in the "+ import as pipeline" functionality.
So the error has now changed, and I'm trying to troubleshoot it:
ERROR 2023-08-20T16:24:35.090943627Z Workflow failed. Causes: S09:Read-from-JdbcIO-Create-Values-Read-CreateSource--ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-Bounde/ProcessElementAndRestrictionWithSizing+Read from JdbcIO/JdbcIO.ReadAll/ParDo(Read)/ParMultiDo(Read)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)/ParMultiDo(Anonymous)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/MapElements/Map/ParMultiDo(Anonymous) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f) at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39) at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source) at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325) at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252) at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788) at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142) at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214) at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320) at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source) at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096) at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142) at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656) at org.apache.beam.fn.harness.…
{
"textPayload": "Workflow failed. Causes: S09:Read-from-JdbcIO-Create-Values-Read-CreateSource--ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-Bounde/ProcessElementAndRestrictionWithSizing+Read from JdbcIO/JdbcIO.ReadAll/ParDo(Read)/ParMultiDo(Read)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/Consume/ParDo(Anonymous)/ParMultiDo(Anonymous)+Read from JdbcIO/JdbcIO.ReadAll/JdbcIO.Reparallelize/View.AsIterable/MapElements/Map/ParMultiDo(Anonymous) failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. If the logs only contain generic timeout errors related to accessing external resources, such as MongoDB, verify that the worker service account has permission to access the resource's subnetwork. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: \n\n Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)\n\tat org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat 
java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:653)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\nCaused by: com.microsoft.sqlserver.jdbc.SQLServerException: Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f\n\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:259)\n\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)\n\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4548)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3409)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:85)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3373)\n\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7344)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2713)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2261)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1921)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1762)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1077)\n\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:623)\n\tat org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:52)\n\tat org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:374)\n\tat org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:106)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:649)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat 
org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv,\n\n Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:ed38b03f-8282-45fd-818e-57a5be40b000)\n\tat org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:ed38b03f-8282-45fd-818e-57a5be40b000)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:653)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\nCaused by: com.microsoft.sqlserver.jdbc.SQLServerException: Falha de logon do usuário 'wk_dac'. ClientConnectionId:ed38b03f-8282-45fd-818e-57a5be40b000\n\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:259)\n\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)\n\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4548)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3409)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:85)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3373)\n\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7344)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2713)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2261)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1921)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1762)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1077)\n\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:623)\n\tat org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:52)\n\tat org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:374)\n\tat org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:106)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:649)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat 
org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv,\n\n Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:3fa24171-8a4a-4511-be71-d6eb068bedd4)\n\tat org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:3fa24171-8a4a-4511-be71-d6eb068bedd4)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:653)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\nCaused by: com.microsoft.sqlserver.jdbc.SQLServerException: Falha de logon do usuário 'wk_dac'. ClientConnectionId:3fa24171-8a4a-4511-be71-d6eb068bedd4\n\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:259)\n\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)\n\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4548)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3409)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:85)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3373)\n\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7344)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2713)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2261)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1921)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1762)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1077)\n\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:623)\n\tat org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:52)\n\tat org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:374)\n\tat org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:106)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:649)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat 
org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv,\n\n Root cause: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c)\n\tat org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.sql.SQLException: Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. 
ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:653)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\nCaused by: com.microsoft.sqlserver.jdbc.SQLServerException: Falha de logon do usuário 'wk_dac'. ClientConnectionId:b25c3fe0-0002-420f-9b54-67b09508c44c\n\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:259)\n\tat com.microsoft.sqlserver.jdbc.TDSTokenHandler.onEOF(tdsparser.java:256)\n\tat com.microsoft.sqlserver.jdbc.TDSParser.parse(tdsparser.java:108)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.sendLogon(SQLServerConnection.java:4548)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.logon(SQLServerConnection.java:3409)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.access$100(SQLServerConnection.java:85)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection$LogonCommand.doExecute(SQLServerConnection.java:3373)\n\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7344)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2713)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectHelper(SQLServerConnection.java:2261)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.login(SQLServerConnection.java:1921)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connectInternal(SQLServerConnection.java:1762)\n\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.connect(SQLServerConnection.java:1077)\n\tat com.microsoft.sqlserver.jdbc.SQLServerDriver.connect(SQLServerDriver.java:623)\n\tat org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:52)\n\tat org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:374)\n\tat org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:106)\n\tat org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:649)\n\tat org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:531)\n\tat org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:731)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.getConnection(JdbcIO.java:1503)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn.processElement(JdbcIO.java:1516)\n\tat org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)\n\tat 
org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:320)\n\tat org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)\n\tat org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:325)\n\tat org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:252)\n\tat org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)\n\tat org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)\n\tat org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)\n\tat org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\n\n Worker ID: carga-erp-primavia-mp--16-08200921-2cn6-harness-5vzv",
"insertId": "8lvhbvch4h",
"resource": {
"type": "dataflow_step",
"labels": {
"job_id": "2023-08-20_09_20_00-17119339413247852494",
"job_name": "carga-erp-primavia-mp--1692548400-58782424111294090",
"region": "southamerica-east1",
"step_id": "",
"project_id": "167979788553"
}
},
"timestamp": "2023-08-20T16:24:35.090943627Z",
"severity": "ERROR",
"labels": {
"dataflow.googleapis.com/region": "southamerica-east1",
"dataflow.googleapis.com/job_name": "carga-erp-primavia-mp--1692548400-58782424111294090",
"dataflow.googleapis.com/log_type": "system",
"dataflow.googleapis.com/job_id": "2023-08-20_09_20_00-17119339413247852494"
},
"logName": "projects/dac-dados/logs/dataflow.googleapis.com%2Fjob-message",
"receiveTimestamp": "2023-08-20T16:24:36.742832022Z"
}
The error message you're encountering indicates an issue with the JDBC connection to your database. Specifically, the error "Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)" suggests that the user "wk_dac" faced a login failure.
Possible reasons for this include: an incorrect password being passed to the pipeline, the wk_dac login being disabled or locked, the server not accepting SQL Server (mixed-mode) authentication, or the login not having access to the target database.
To address this: verify that the username and password configured in the pipeline exactly match what works when you run the job directly, and confirm the login is enabled on the server.
Further steps to consider: test the same credentials from outside Dataflow (a minimal sketch follows below) and compare the connection parameters of the pipeline with those of the job that succeeds.
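To rule the credentials in or out quickly, here is a minimal sketch that tries the same login from outside Dataflow. It assumes the pyodbc package and an installed SQL Server ODBC driver; the host, database, and password values are placeholders.

```python
# Minimal sketch: verify the SQL Server login independently of Dataflow.
# Assumes pyodbc and "ODBC Driver 17 for SQL Server"; values are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlserver.example.internal,1433;"   # placeholder host
    "DATABASE=my_database;"                     # placeholder database
    "UID=wk_dac;"
    "PWD=<password>;"                           # placeholder password
)
try:
    with pyodbc.connect(conn_str, timeout=10) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT 1")
        print("Login OK:", cursor.fetchone())
except pyodbc.Error as exc:
    print("Login failed:", exc)
```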
@ms4446 thanks for addressing the troubleshooting. But look at the scenario:
The Job runs with no error.
If I create the pipeline using "+ IMPORT AS PIPELINE" from the Jobs screen, with the same job I just ran, I get the INVALID_ARGUMENT error.
If I create the pipeline from the Pipelines screen using "CREATE DATA PIPELINE", I get the error "Cannot create PoolableConnectionFactory (Falha de logon do usuário 'wk_dac'. ClientConnectionId:91341482-d533-4097-aef4-e5b88dfe5d4f)".
The user wk_dac has access to the database, because the job runs and the data is stored in BigQuery.
To create the job or the pipeline I use the Google template "SQL Server to BigQuery".
We are trying to use a different JDBC driver, but what confuses me is that the job by itself runs with no problem.
Thank you again for your time and suggestions
Based on the details you've shared, it appears that the discrepancies arise from the method used to create the pipeline.
Using "Import as Pipeline" from the Jobs screen:
INVALID_ARGUMENT
error. There could be a bug or a missing configuration that's causing this issue.Using "Create Data Pipeline" from the Pipelines screen:
Cannot create PoolableConnectionFactory
error suggests there might be a discrepancy in the JDBC connection details or other configurations.To address these issues:
Re-import the Pipeline: Try importing the pipeline from the Jobs screen once more. Ensure that you meticulously review and specify all the configuration details from the original job.
Re-create the Pipeline: When using the "Create Data Pipeline" option from the Pipelines screen, double-check all configurations, especially those related to database connections, to ensure they match the successful job's settings (the successful job's options can be pulled for comparison as sketched at the end of this reply).
If, after these steps, the issue persists, reaching out to Google Cloud Support might be the best course of action. They could provide insights into any known issues or specific configurations required for the template you're using.
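To make that comparison easier, here is a minimal sketch that pulls the description of the job that runs successfully so its options can be checked against the pipeline's configuration. It assumes Application Default Credentials; the project, location, and job ID are placeholders, and the exact fields returned can vary by job type.

```python
# Minimal sketch: fetch the successful job's description for comparison.
# Assumes Application Default Credentials; IDs below are placeholders.
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

project, location = "my-project", "southamerica-east1"
job_id = "<job-id-of-the-successful-run>"  # placeholder
resp = session.get(
    f"https://dataflow.googleapis.com/v1b3/projects/{project}/locations/{location}/jobs/{job_id}",
    params={"view": "JOB_VIEW_ALL"},
)
job = resp.json()
print(job.get("name"))
print(job.get("environment", {}).get("sdkPipelineOptions"))
```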
@ms4446 I was still troubleshooting, and as a workaround I was running the job manually every day, but yesterday and again today, when running the job I got a new error:
"org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot create PoolableConnectionFactory (The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target". ClientConnectionId:e5bc7539-1271-4f62-8801-c9d7c129e35d)
I didn't find any job configuration to set up a certificate. Do you have any idea?
Hi @RonaldoSales,
The error message you're seeing is related to SSL encryption when trying to establish a connection to the SQL Server. The specific error "PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target" indicates that the Java application (in this case, the Dataflow job) cannot validate or trust the certificate presented by the SQL Server.
Here are some steps to troubleshoot and potentially resolve the issue:
Trust the Certificate: Import the certificate used by the SQL Server (or its issuing CA) into the Java truststore that the Dataflow workers use, so the driver can build a valid certification path.
Update JDBC Driver: Make sure you're using a recent version of the Microsoft JDBC driver, as older versions can have stricter or outdated TLS behaviour.
Disable SSL (Not Recommended for Production): As a quick diagnostic, you can add encrypt=false to the connection string (connection-string options are sketched at the end of this reply). However, this is not recommended for production environments as it exposes your data to potential eavesdropping.
Check SQL Server Configuration: Verify which certificate the server is configured to present and whether it has recently changed or expired.
Java Version: Confirm that the Java runtime used by the workers trusts the CA that issued the server certificate.
Custom Dataflow Image: If needed, a custom worker container image can ship with the certificate already added to its truststore.
Remember, while disabling SSL can be a quick way to diagnose if SSL is the root cause, it's crucial to ensure all connections are encrypted, especially in production environments, to maintain data security and integrity.
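For reference, here is a minimal sketch of the Microsoft JDBC driver TLS options mentioned above. The host and database are placeholders, and the resulting string would go into the template's JDBC connection URL parameter.

```python
# Minimal sketch of Microsoft JDBC driver TLS properties relevant to the
# PKIX error. Host and database are placeholders.
jdbc_url = (
    "jdbc:sqlserver://sqlserver.example.internal:1433;"
    "databaseName=my_database;"
    "encrypt=true;"
    # Diagnostic only - accepts any server certificate, so avoid in production:
    "trustServerCertificate=true;"
    # Preferred: trust a specific CA via a truststore shipped with the job:
    # "trustStore=/path/to/truststore.jks;trustStorePassword=<secret>;"
)
print(jdbc_url)
```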