
Adding search parameters to Shopify connector

With the Shopify connector, I'm able to make paginated requests using a while loop, with the help of this very useful article.

My use case is to use a scheduled API trigger to run the workflow every day or week to pull in new orders, and I can't figure out how to do that. I tried the filterClause, but I believe it only filters the data after it has been received from the connector?

The Shopify API allows you to use parameters, e.g. created_at_min or since_id, to get back only a subset of the orders. I would expect the connector to be able to handle such parameters somehow, but I can't figure out how to do this, or whether I am missing something.
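For context, here is roughly what those parameters look like in a raw call to the Shopify Admin REST API (the shop domain and API version below are placeholders, and a real request would also need an X-Shopify-Access-Token header):

```python
from urllib.parse import urlencode

# Hypothetical shop domain; substitute your own store and API version.
BASE = "https://example-shop.myshopify.com/admin/api/2024-01/orders.json"

params = {
    "status": "any",
    "created_at_min": "2024-01-01T00:00:00Z",  # only orders created after this
    "limit": 250,                              # Shopify's maximum page size
}
url = f"{BASE}?{urlencode(params)}"
print(url)
```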

If you can point me in the right direction, I'd appreciate it!

Michael

Solved
1 ACCEPTED SOLUTION

Hi @mwhitaker , 

Instead of the API Trigger, you could try using the Cloud Scheduler Trigger to start your integration. This lets you specify a schedule for running the integration: every hour, every day, on the third Tuesday of the month, and so on. Essentially, you can use a cron expression to configure the schedule.

Probably the trickiest part of a use case like this is keeping the datetime stamp of when you last ran the Shopify query, so that you can set up your filterClause with the right criteria. I suspect you want something like "Give me all the rows in this Shopify entity that were created or updated since the last time I ran this query (e.g. one hour ago), or since ID# x". To do this, you will need a state database that keeps the last datetime or the last ID from the previous run. You look that value up to generate your filterClause, and once the query has run, you write the new value back so everything is set up for the next run.
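As a rough sketch of that state-keeping logic (the field names createdAt and id are illustrative; check your connector entity's schema for the exact names):

```python
from datetime import datetime, timezone
from typing import Optional

def build_filter_clause(last_run: Optional[str], last_id: Optional[int] = None) -> str:
    """Build a filterClause from the state saved by the previous run.

    last_run is an ISO-8601 timestamp (None on the very first run);
    last_id is the highest order ID seen so far, if filtering by ID.
    """
    clauses = []
    if last_run:
        clauses.append(f"createdAt >= '{last_run}'")
    if last_id is not None:
        clauses.append(f"id > {last_id}")
    return " AND ".join(clauses)

# After a successful run, persist the current time for the next run.
next_checkpoint = datetime.now(timezone.utc).isoformat(timespec="seconds")
```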

If you already have a database on GCP for other purposes, you could certainly reuse an existing instance (for example, a Cloud SQL database, Spanner, or BigQuery) and create a simple table to keep this datetime/ID info. If you don't have anything already available, you could use Firestore, a nice lightweight cloud database on GCP, and use the Firestore Connector Tasks to read and write from Firestore right from your integration. Any database will work, so use whatever you are most familiar and comfortable with.


4 REPLIES


Thank you @shaaland, and sorry about the late reply. I think storing the last ID pulled in Firestore will be the way to go.

Another thing that tripped me up is that the filterClause switches to camelCase, unlike the snake_case field names in the Shopify API docs.
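For anyone hitting the same thing: the rename is just snake_case to camelCase, and a small helper like this (illustrative, not part of the connector) shows the mapping:

```python
def to_camel(field: str) -> str:
    """Convert a snake_case Shopify field name to the camelCase form
    the connector's filterClause expects."""
    head, *rest = field.split("_")
    return head + "".join(part.capitalize() for part in rest)

print(to_camel("created_at"))  # createdAt
```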

Finally (and this is unrelated to my original question), I have to remember to suspend the connector nodes, since I only intend to do scheduled pulls once a day or week. Leaving the default two connector nodes running incurs unacceptably high charges. Learning with the pocketbook...

Hi @mwhitaker, it sounds like you're looking for a way to schedule API requests to pull in new Shopify orders using specific parameters like created_at_min or since_id. Here are a few suggestions that might help:

Using Shopify API Parameters

  1. API parameters: You can use the created_at_min and since_id parameters directly in your API requests to filter the orders Shopify returns, so you retrieve only the orders created since your last request.

  2. Scheduling API requests: To automate this, run your API requests on a daily or weekly schedule, for example with cron jobs or cloud functions, depending on your infrastructure.
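Putting those two points together, a scheduled job could persist the last seen order ID and use it on the next pull. A minimal sketch (the shop domain, API version, and state-file location are all placeholders):

```python
import json
from pathlib import Path
from urllib.parse import urlencode

STATE_FILE = Path("shopify_state.json")  # hypothetical local state store
BASE = "https://example-shop.myshopify.com/admin/api/2024-01/orders.json"

def build_request_url() -> str:
    """Build the orders URL, adding since_id when a previous run
    recorded the last order ID, so only newer orders come back."""
    params = {"status": "any", "limit": 250}
    if STATE_FILE.exists():
        params["since_id"] = json.loads(STATE_FILE.read_text())["last_id"]
    return f"{BASE}?{urlencode(params)}"

def save_checkpoint(last_id: int) -> None:
    """Record the highest order ID seen, for the next scheduled run."""
    STATE_FILE.write_text(json.dumps({"last_id": last_id}))
```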

Alternative Solution

If you're looking for a more streamlined and efficient way to handle this, you might consider using third-party connectors that simplify this process. For instance, Windsor.ai offers a robust Shopify connector that supports using parameters like created_at_min and since_id to filter your data. Additionally, it allows you to schedule your data extraction, ensuring that your workflow runs automatically and pulls in new orders as needed. Using such solutions can save you time and reduce the complexity of managing API requests manually.

Hope this helps!
