
Dialogflow CX integration with BigQuery is not working

I've set up a new agent using Dialogflow CX. It is receiving calls and working fine, but it is not storing the conversation history in the BigQuery database.

- I have enabled "Enable BigQuery export"

- the dataset and data table are already created in BigQuery as per the documentation

The call generates a transcript, but it is not being stored in BigQuery.

Can someone suggest a solution or a hint for the issue? I am running out of leads in the official documentation.

Solved
1 ACCEPTED SOLUTION

Hi, it worked for me; see the screenshots below. I had to create the dataset in a single region, not a multi-region dataset.

xavidop_0-1694676934598.png

xavidop_1-1694677007360.png


15 REPLIES

Hi, 

You have to enable interaction logging in the agent too. This setting indicates whether you would like Google to collect and store redacted end-user queries for quality improvement.

These are the steps:

  1. Ensure that interaction logging is enabled.
  2. Follow the BigQuery dataset creation guide to create a dataset. Note the dataset name, as you will need this in the next step.
  3. Follow the BigQuery table creation guide to create a table with a SQL schema definition. Use the following SQL statement for creation:

     
    CREATE TABLE <your_dataset_name>.dialogflow_bigquery_export_data (
      project_id STRING,
      agent_id STRING,
      conversation_name STRING,
      turn_position INTEGER,
      request_time TIMESTAMP,
      language_code STRING,
      request JSON,
      response JSON,
      partial_responses JSON,
      derived_data JSON,
      conversation_signals JSON
    );
  4. Configure your agent settings to enable BigQuery export, and to provide the dataset and table names created above.
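
The column names in step 3 have to match the documented schema exactly. As a quick sanity check, here is a small Python sketch (a hypothetical helper, not part of Dialogflow or BigQuery) that builds the same CREATE TABLE statement from the documented column list, so you can diff it against the DDL you actually ran:

```python
# Column names and types from the Dialogflow CX BigQuery export docs.
EXPORT_COLUMNS = [
    ("project_id", "STRING"),
    ("agent_id", "STRING"),
    ("conversation_name", "STRING"),
    ("turn_position", "INTEGER"),
    ("request_time", "TIMESTAMP"),
    ("language_code", "STRING"),
    ("request", "JSON"),
    ("response", "JSON"),
    ("partial_responses", "JSON"),
    ("derived_data", "JSON"),
    ("conversation_signals", "JSON"),
]

def export_table_ddl(dataset_name, table_name="dialogflow_bigquery_export_data"):
    """Build the CREATE TABLE statement with the exact column names
    Dialogflow CX expects for BigQuery export."""
    cols = ",\n  ".join(f"{name} {col_type}" for name, col_type in EXPORT_COLUMNS)
    return f"CREATE TABLE {dataset_name}.{table_name} (\n  {cols}\n);"

print(export_table_ddl("my_dataset"))
```

Run it with your own dataset name and compare the output against the table you created in the BigQuery console.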

I already followed these steps while setting up the connection between Dialogflow CX and BigQuery, as per this documentation: https://cloud.google.com/dialogflow/cx/docs/concept/export-bq.

I also enabled "Enable interaction logging" along with "Enable Cloud Logging" to see if it generates any logs, but it is still not working and is not generating any logs either.

Please suggest.

Hi!

Are the BigQuery dataset and the agent under the same project and region?

@xavidop, yes, both are under the same project and the same region (US).

The agent region is (Global Serving, data-at-rest in US).

The BigQuery dataset is multi-region (multiple regions in US).

@xavidop 

Is there a way to check the integration logs to see what is happening inside?

Adding more details here.

This is the table schema:

kundanugale_2-1694594328135.png

 

Agent configuration

kundanugale_1-1694594277518.png

Are you seeing interactions stored in the conversation history tab?

Yes, conversation history is showing correctly; I can see all of the interactions in the conversation history tab. @xavidop, any help would be much appreciated.

 

Have you enabled interaction logging?

Some features, such as Experiments, Conversation history, BigQuery export, Analytics, and NLU model improvement, require this setting to be enabled, as they rely on traffic logs.

Yes, "interaction logging" is enabled, as shown in the earlier post (screenshot). If you have a working implementation, or could make a short YouTube video, that would be great.


Let me try that out. Thanks a lot, much appreciated.

In the left tree view, do you see anything under "External Connections"? Just to verify the same at my end, as currently I don't see anything under it.

Just to confirm, you selected the agent region as (Global Serving, data-at-rest in US), right?

I have my agent in us-central1

Thanks a lot @xavidop, that was quite helpful. It's working for me now.

The issue was incorrect column names: each column name had its type appended (as seen in the screenshot above), because I copied the column names as-is from the documentation.
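
For anyone hitting the same thing: the export fails silently if the table's column names don't exactly match the documented schema. A minimal Python sketch (hypothetical helper names, not an official tool) of the check that would have caught my mistake, where each column name had its type glued on:

```python
# Expected column names from the Dialogflow CX BigQuery export schema.
EXPECTED = [
    "project_id", "agent_id", "conversation_name", "turn_position",
    "request_time", "language_code", "request", "response",
    "partial_responses", "derived_data", "conversation_signals",
]

def find_bad_columns(actual_columns):
    """Return the column names that don't match the expected export schema."""
    return [name for name in actual_columns if name not in EXPECTED]

# My broken table: the type was appended to each column name because
# I copied "project_id STRING" etc. verbatim as the name itself.
broken = ["project_id STRING", "agent_id STRING", "turn_position INTEGER"]
print(find_bad_columns(broken))  # every one of these is flagged

good = ["project_id", "agent_id", "turn_position"]
print(find_bad_columns(good))    # []
```

Paste your table's column names (from the BigQuery console schema tab) into the list and anything flagged is a name Dialogflow won't write to.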

Glad to hear that!