We have several flows that use a webhook listener as the source, with NetSuite and an S3 backup as the remaining steps. We keep running into issues where a flow runs incredibly slowly and doesn't catch up until there is a big enough gap with no new records coming in.
E.g., yesterday we had a flow that gets shipments from one of our warehouses and creates the item fulfillment in NetSuite (with some lookups to NetSuite within the flow to source the tracking links from a custom record).
It took pretty much all day to complete, and only caught up at 5am after 7 hours of no new shipments.
I'm starting to think a real-time flow is an inefficient way of handling this and that we need to re-engineer it.
My current thinking is to split the flow as follows:
Current
1 flow to do everything
Source - webhook listener (with a hook to split records per tracking number)
This then branches to -
Branch 1 - NetSuite lookups to source the tracking links + create the fulfillment record
Branch 2 - store a JSON backup in S3
As this uses a webhook listener, everything happens in real time, continuously
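The slowdown pattern above looks like a classic queueing problem: when the per-record work (the NetSuite lookups + record create) takes longer than the gap between incoming webhook records, the backlog grows until arrivals pause. A minimal sketch of that arithmetic, with hypothetical timings (the 3s arrival gap and 5s service time are assumptions, not measured values):

```python
def backlog_after(hours: float, arrival_s: float = 3.0, service_s: float = 5.0) -> int:
    """Records still queued after `hours` of steady arrivals.

    Simple steady-state model: arrivals come every `arrival_s` seconds,
    each record takes `service_s` seconds to process one-at-a-time.
    Whenever service_s > arrival_s, the backlog grows without bound.
    """
    arrivals = hours * 3600 / arrival_s
    processed = hours * 3600 / service_s
    return max(0, int(arrivals - processed))

# With the assumed timings, an 8-hour shipping day leaves thousands of
# records queued -- which only drain once shipments stop overnight.
print(backlog_after(8))
```

This is why the flow "catches up at 5am": the queue only shrinks once the arrival rate drops below the processing rate.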
Proposed
Split into 2 flows
Flow 1 will still have the webhook listener, but will only run branch 2 from above, storing the JSON in S3
Flow 2 will then run the branch 1 steps from above, but on a schedule, with the S3 bucket from flow 1 as its source
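The proposed split is essentially a buffer pattern: the webhook path does only a cheap constant-time write, and the expensive NetSuite work is drained in scheduled batches. A minimal sketch of the idea, using an in-memory dict as a stand-in for the S3 bucket (the key layout, `inbox/`/`processed/` prefixes, and `create_fulfillment` callback are all hypothetical, not platform APIs):

```python
import json
import time

# Stand-in for the S3 bucket; a real flow would use boto3 put_object/list_objects_v2.
bucket: dict[str, str] = {}

def ingest(payload: dict) -> str:
    """Flow 1: the webhook listener's only job is to persist the raw JSON.
    This is fast and constant-time, so it never falls behind arrivals."""
    key = f"inbox/{payload['tracking_number']}-{time.time_ns()}.json"
    bucket[key] = json.dumps(payload)
    return key

def process_batch(create_fulfillment) -> int:
    """Flow 2: a scheduled job drains the inbox and calls NetSuite per record.
    `create_fulfillment` stands in for the lookups + item fulfillment create."""
    done = 0
    for key in sorted(k for k in bucket if k.startswith("inbox/")):
        record = json.loads(bucket.pop(key))
        create_fulfillment(record)
        # Move the object under processed/ so reruns are idempotent.
        bucket["processed/" + key[len("inbox/"):]] = json.dumps(record)
        done += 1
    return done

# Usage: two webhook deliveries land, then the scheduled run drains both.
created = []
ingest({"tracking_number": "1Z999", "items": []})
ingest({"tracking_number": "1Z998", "items": []})
print(process_batch(created.append))
```

The design point is that slow NetSuite calls no longer back-pressure the webhook: Flow 1 always keeps up, and Flow 2's schedule just determines how stale the fulfillments can get.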
Has anyone had similar issues? Is a scheduled flow a much more sensible way of doing this?
Thanks!