
Dialogflow CX Data Store Invalid Value Error

We used to be able to organize FAQ data as structured data in CX, using BigQuery (BQ) with SQL to generate and provide multiple examples, including our actual data, for use as RAG in CX. I say "used to" because we recently noticed that the reference was automatically removed from the FAQ documents connector, and when we try to re-add it, we get the following error:


 Invalid value at 'flow.knowledge_connector_settings.data_store_connections[1].data_store_type' (type.googleapis.com/google.cloud.dialogflow.v3alpha1.DataStoreType), "STRUCTURED"
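For context, the value being rejected lives in a `dataStoreConnections` entry on the flow's knowledge connector settings. A minimal sketch of the payload involved, with field names per the Dialogflow CX v3beta1 REST reference (the project/location/data store IDs are placeholders, not real values):

```json
{
  "knowledgeConnectorSettings": {
    "enabled": true,
    "dataStoreConnections": [
      {
        "dataStoreType": "STRUCTURED",
        "dataStore": "projects/PROJECT_ID/locations/LOCATION/collections/default_collection/dataStores/DATA_STORE_ID"
      }
    ]
  }
}
```

The error message suggests the backend was validating `"STRUCTURED"` against a `v3alpha1` enum that didn't accept it, even though the console labels the data store as structured.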

This error makes no sense: the structure of our data hasn't changed, Vertex AI had no issues processing the data store as "structured", and this worked perfectly up until recently.

Superdave_0-1729108589377.png

Then I reference the data store, which shows all green checkmarks and is listed as "structured":

Superdave_1-1729108740518.png

Hitting Save gives me...

Superdave_2-1729108815331.png

And then it clears out the drop-down for the data store connection. Is it bugged?
I've even tried creating a new data store (also structured), binding it to the same CX instance, and choosing that in the drop-down, and it's the same result/error.

Solved
1 ACCEPTED SOLUTION

Hi @Superdave,

Welcome to Google Cloud Community!

It was observed by our engineers that there were errors related to this. A fix has been rolled out and this should be working on your end now.

Hope this helps.


2 REPLIES

I'm having this problem too; I'm using a CSV file for FAQ data. I wonder if it's related to the documentation: in DataStoreType there's no "STRUCTURED" type, only FAQ. Maybe it's a bug and structured data stores should be labeled as FAQ, but that isn't happening.

dzelt17_0-1729171553334.png
