
[TRI-1243] Airtable webhooks

Open matt-aitken opened this issue 2 years ago • 7 comments

This requires these changes first:

1. Upgrade Graphile worker to 0.14.0 (https://github.com/triggerdotdev/trigger.dev/issues/396)

2. Add batch event support (https://github.com/triggerdotdev/trigger.dev/issues/348)

From SyncLinear.com | TRI-1243

matt-aitken avatar Sep 07 '23 10:09 matt-aitken

@nicktrn this one is for you now

matt-aitken avatar Oct 02 '23 16:10 matt-aitken

Any ETA on this?

patrykmaron avatar Oct 18 '23 22:10 patrykmaron

> Any ETA on this?

Could be next week if all goes well! The PR for batched events and Airtable will be in later today, but #524 will have to be merged first. There are still some things to test and finalize regarding DB migrations.

@patrykmaron On another note, do you have a use-case example for this? How do you intend to process the payloads?

From my own testing they are very unpleasant to work with - even with batching enabled. Lots of useless data, painful to drill down into nested props you actually need. (We discussed the concept of pluggable "payload transformers", maybe that could help here.)
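As an illustration of what such a transformer might do, here is a minimal sketch that flattens a batch of webhook payloads into a plain list of changed records. The `changedTablesById` / `changedRecordsById` names mirror Airtable's documented webhook payload shape, but treat the whole type as a simplified assumption, not the full schema:

```typescript
// Hypothetical, heavily simplified shape of an Airtable webhook payload.
// Real payloads carry much more nested detail per record.
type WebhookPayload = {
  changedTablesById: Record<
    string,
    { changedRecordsById?: Record<string, unknown> }
  >;
};

// Flatten a batch of payloads into (tableId, recordId) pairs so
// downstream code never has to drill into the nested structure.
function changedRecords(
  batch: WebhookPayload[]
): { tableId: string; recordId: string }[] {
  const out: { tableId: string; recordId: string }[] = [];
  for (const payload of batch) {
    for (const [tableId, table] of Object.entries(payload.changedTablesById)) {
      for (const recordId of Object.keys(table.changedRecordsById ?? {})) {
        out.push({ tableId, recordId });
      }
    }
  }
  return out;
}
```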

Would greatly value your feedback!

nicktrn avatar Oct 19 '23 08:10 nicktrn

@nicktrn They are unpleasant; I was wondering if trigger.dev was going to come up with some solution for the payloads 😂

Our use case is that we need to sync a table from Airtable with our MySQL db (business requirement). But just playing around with webhooks from Airtable, the experience has been horrible; I need to figure out how to process the payload. I was going to use Hookdeck to manage webhooks from Airtable, but Airtable only sends pings instead of delivering the data.

Currently I see that the payload response just grows, and you're meant to paginate through it? Now it seems I also have to hold the timestamp of the last processed payload.
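For context, Airtable's "list webhook payloads" endpoint does work this way: each page returns a `cursor` and a `mightHaveMore` flag, and you persist the cursor between runs. A sketch of draining it, with the page fetcher injected so the loop stays testable (in real code `fetchPage` would call the Airtable REST API; the field names follow Airtable's docs but treat this as an assumption-laden sketch, not a drop-in client):

```typescript
// One page of webhook payloads, per Airtable's documented response shape.
type PayloadPage<T> = { payloads: T[]; cursor: number; mightHaveMore: boolean };

// Drain all pending payloads starting from a stored cursor.
// Returns the payloads plus the cursor to persist for the next run.
async function drainPayloads<T>(
  fetchPage: (cursor: number) => Promise<PayloadPage<T>>,
  startCursor = 1
): Promise<{ payloads: T[]; nextCursor: number }> {
  const all: T[] = [];
  let cursor = startCursor;
  let more = true;
  while (more) {
    const page = await fetchPage(cursor);
    all.push(...page.payloads);
    cursor = page.cursor; // persist this (e.g. upsert to your DB) between runs
    more = page.mightHaveMore;
  }
  return { payloads: all, nextCursor: cursor };
}
```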

Honestly, at this point I think it's simpler to run a cron job that downloads a CSV of the table from Airtable and processes that data into SQL...

patrykmaron avatar Oct 19 '23 09:10 patrykmaron

The good news is that the integration already takes care of fetching the actual payloads for you!

https://github.com/triggerdotdev/trigger.dev/blob/c8aaea8ad0e484e7845d52ee8d1eced5818ea642/integrations/airtable/src/webhooks.ts#L389-L390

The payloads will then be delivered in batches, i.e. as an array of payloads. How we batch will be based on time intervals and/or maximum batch size (TBD).
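Since the exact batching strategy was still TBD at the time, here is only an illustrative sketch of the size-capped half of it (a time-window trigger would sit around this; names are hypothetical, not trigger.dev's API):

```typescript
// Split a stream of events into batches capped at maxSize.
// A real implementation would also flush on a time interval.
function toBatches<T>(events: T[], maxSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < events.length; i += maxSize) {
    batches.push(events.slice(i, i + maxSize));
  }
  return batches;
}
```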

You could upsert the latest baseTransactionNumber to keep track of things on your end.

I personally would stay away from those webhooks unless you really need to sync changes immediately.

Did you consider using the getRecords() task on a schedule of say once a day? Maybe you could just store daily snapshots in the DB. Or do a diff and update the previous records. (Sorry, no helpers for that!)
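The "do a diff" option above can be sketched as a pure function keyed by record ID. This is a minimal illustration under assumed record shapes (there is no such helper in the integration, as noted):

```typescript
// Assumed minimal record shape for the sketch.
type Rec = { id: string; fields: Record<string, unknown> };

// Compare the previous snapshot against a fresh pull and report
// which records were added, changed, or removed.
function diffRecords(prev: Rec[], next: Rec[]) {
  const prevById = new Map(prev.map((r): [string, Rec] => [r.id, r]));
  const added: Rec[] = [];
  const changed: Rec[] = [];
  for (const r of next) {
    const old = prevById.get(r.id);
    if (!old) {
      added.push(r);
    } else if (JSON.stringify(old.fields) !== JSON.stringify(r.fields)) {
      changed.push(r);
    }
    prevById.delete(r.id);
  }
  // Anything left in the map no longer exists in the fresh pull.
  const removed = [...prevById.values()];
  return { added, changed, removed };
}
```

Note the JSON.stringify comparison is order-sensitive on field keys; a real diff might compare field-by-field instead.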

nicktrn avatar Oct 19 '23 10:10 nicktrn

> I personally would stay away from those webhooks unless you really need to sync changes immediately.

This is the conclusion I have arrived at!

I will set up a cron job and check which records have been modified using their last_modified field, perhaps at a 15-minute or shorter interval, or maybe once a day like you said.
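For reference, that last-modified check can be pushed down to Airtable itself: the list-records endpoint accepts a `filterByFormula` query parameter, and `IS_AFTER()` and `LAST_MODIFIED_TIME()` are documented Airtable formula functions. A tiny helper that builds the formula string for "modified since the last run" might look like:

```typescript
// Build an Airtable filterByFormula expression selecting records
// modified after the given ISO timestamp (e.g. the last cron run).
function modifiedSinceFormula(isoTimestamp: string): string {
  return `IS_AFTER(LAST_MODIFIED_TIME(), '${isoTimestamp}')`;
}
```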

Weird implementation by the Airtable team; it's not a standard implementation, am I correct?

patrykmaron avatar Oct 19 '23 14:10 patrykmaron

> Weird implementation by the Airtable team; it's not a standard implementation, am I correct?

No, not at all. That's not how webhooks should work.

Feel free to open an issue if you run into any trouble with those scheduled tasks. Best of luck :pray:

nicktrn avatar Oct 19 '23 16:10 nicktrn