I am working on a proof of concept to create a unified search for my organization. We have some of our data available via REST API.
How do I go about setting up a data store that imports data from a given endpoint every week?
Is there any sample code someone can point me to for the steps below?
1. Create a data store
2. Authenticate to this data store via the REST API (preferably Python)
3. Import data into this data store via the REST API weekly (preferably via a Cloud Function)
Thanks in advance!
Hello aideveloper,
Welcome to Google Cloud Community!
To create a data store and ingest data for search, go to the section for the source you plan to use in the Create a search data store documentation.
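As a starting point, here is a minimal sketch of creating a generic search data store with the `google-cloud-discoveryengine` Python client library. The project ID, location, display name, and data store ID below are placeholders you would replace with your own values:

```python
from google.cloud import discoveryengine_v1 as discoveryengine

# Placeholders -- replace with your own values.
PROJECT_ID = "your-project-id"
LOCATION = "global"
DATA_STORE_ID = "my-unified-search-store"

client = discoveryengine.DataStoreServiceClient()

parent = (
    f"projects/{PROJECT_ID}/locations/{LOCATION}"
    "/collections/default_collection"
)

data_store = discoveryengine.DataStore(
    display_name="Unified search data store",
    industry_vertical=discoveryengine.IndustryVertical.GENERIC,
    solution_types=[discoveryengine.SolutionType.SOLUTION_TYPE_SEARCH],
    # NO_CONTENT suits structured (e.g. BigQuery) data; use
    # CONTENT_REQUIRED if you will ingest unstructured documents.
    content_config=discoveryengine.DataStore.ContentConfig.NO_CONTENT,
)

# create_data_store returns a long-running operation; result() blocks
# until the data store has been created.
operation = client.create_data_store(
    parent=parent,
    data_store=data_store,
    data_store_id=DATA_STORE_ID,
)
print(operation.result())
```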
This document describes how to authenticate to Vertex AI Agent Builder programmatically. How you authenticate to Vertex AI Agent Builder depends on the interface you use to access the API and the environment where your code is running.
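If you prefer calling the REST API directly instead of using the client library, a common pattern is to use Application Default Credentials from the `google-auth` package. The sketch below lists the data stores in a project; on a Cloud Function the credentials resolve to the function's service account, and locally they resolve to your `gcloud auth application-default login` session:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Application Default Credentials: service account on Cloud
# Functions/Cloud Run, your gcloud login locally.
credentials, project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)

# AuthorizedSession attaches and refreshes the OAuth token for you.
session = AuthorizedSession(credentials)

resp = session.get(
    "https://discoveryengine.googleapis.com/v1/"
    f"projects/{project_id}/locations/global/collections/"
    "default_collection/dataStores"
)
resp.raise_for_status()
print(resp.json())
```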
Here is an example of how you can import data weekly. You can create data stores from BigQuery tables in two ways: one-time ingestion, or periodic ingestion, where the data store refreshes from the table on a schedule you choose.
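For a one-time ingestion that you run on your own schedule, one approach is an HTTP Cloud Function that calls the `documents:import` endpoint. This is a sketch, assuming a BigQuery source; the project, data store, dataset, and table names are placeholders:

```python
import functions_framework
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Placeholders -- replace with your own values.
PROJECT_ID = "your-project-id"
DATA_STORE_ID = "my-unified-search-store"

@functions_framework.http
def weekly_import(request):
    """HTTP Cloud Function that starts a documents:import job."""
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)

    url = (
        "https://discoveryengine.googleapis.com/v1/"
        f"projects/{PROJECT_ID}/locations/global/collections/"
        f"default_collection/dataStores/{DATA_STORE_ID}/"
        "branches/default_branch/documents:import"
    )
    body = {
        "bigquerySource": {
            "projectId": PROJECT_ID,
            "datasetId": "my_dataset",  # placeholder
            "tableId": "my_table",      # placeholder
        },
        # INCREMENTAL upserts new/changed documents; FULL replaces
        # the data store contents with the source.
        "reconciliationMode": "INCREMENTAL",
    }
    resp = session.post(url, json=body)
    resp.raise_for_status()
    # The API returns a long-running operation you can poll.
    return resp.json(), 200
```

To run it weekly, you could create a Cloud Scheduler job with a cron expression such as `0 3 * * 1` (Mondays at 03:00) that calls the function's URL, using OIDC authentication with a service account that has permission to invoke the function.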
I hope the above information is helpful.
Thank you for your response!