Create export after Webhook

I need to extract paged data from an API, let's say https://example.com/dataset_name?page=…, where dataset_name should come from the webhook payload.

The problem is that when I create the webhook, I cannot add an extract node that receives data from the webhook.
Any idea how to solve this?

Just to make sure I understand correctly:
You have a webhook as the source; the webhook receives a JSON payload that contains a node with the dataset name, and you want to use that node's value in the next lookup/API call step.

For example, if the webhook payload contains:

{
  "record": {
    "dataNode": "dataValue"
  }
}

then in the next step you want to call:

https://example.com/dataValue?page=

Is that the scenario?

If yes, this should be fairly straightforward. You can reference the value from the webhook payload in the HTTP URI of the next step.

For example:

https://example.com/{{record.dataNode}}?page=

and for the paging you can use the Pagination setup in the HTTP step.

This will dynamically insert the value received in the webhook.
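To make the mechanics concrete, here is a minimal Python sketch of what that template substitution amounts to: pull the dataset name out of the webhook payload and splice it into the request URL. The payload and URL are the examples from above; this is an illustration, not the platform's actual implementation.

```python
# Example webhook body from the post above.
payload = {"record": {"dataNode": "dataValue"}}

# Equivalent of referencing {{record.dataNode}} in the HTTP URI template.
dataset = payload["record"]["dataNode"]
url = f"https://example.com/{dataset}?page=1"
print(url)  # https://example.com/dataValue?page=1
```

The platform does the same substitution for you at runtime; the sketch just shows which part of the payload the template path points at.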

Hello Nuri,

I want to have an EXPORT node after the webhook, because I want to extract about 500 pages when this webhook is called.

When I use a Lookup node, I am hitting the 5 MB limit. I assumed that if I used an Export node, I could fetch 500 pages. But the EXPORT node can't see the webhook data; only the Lookup node can.

You can absolutely do this in the second flow step, i.e. your EXPORT lookup.

The 5 MB limit applies to the size of a single lookup response, not the total number of pages.

So 500 pages by itself is not a problem, as long as each paginated response is under 5 MB. However, if a single page returns too many records or very large records and the response exceeds 5 MB, you can still hit the limit even when using pagination.

If a lookup response does hit the 5 MB limit, you can usually resolve it by reducing the response size, for example by limiting the fields returned or reducing the number of records per page.
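The paging behaviour described above can be sketched as a simple loop: fetch one page at a time, stop on an empty page, and fail fast if any single response would blow the per-response cap. This is an illustrative Python sketch, not the platform's code; `fetch_page` is a hypothetical stand-in for the HTTP call.

```python
MAX_BYTES = 5 * 1024 * 1024  # the 5 MB per-response limit discussed above

def fetch_all_pages(fetch_page, max_pages=500):
    """fetch_page(page) -> raw bytes for that page; stand-in for the HTTP call."""
    pages = []
    for page in range(1, max_pages + 1):
        body = fetch_page(page)
        if not body:
            break  # an empty page signals no more data
        if len(body) > MAX_BYTES:
            # One oversized page trips the limit even though the total is paged.
            raise ValueError(f"page {page} exceeds the 5 MB response limit")
        pages.append(body)
    return pages

# Usage with a fake fetcher that serves three small pages, then runs out:
pages = fetch_all_pages(lambda p: b"x" * 10 if p <= 3 else b"")
print(len(pages))  # 3
```

The key point the loop makes visible: the cap is checked per page, so shrinking the page size (fewer records or fewer fields per page) is the usual fix.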

Yeah, this can be done in two ways.

  1. Have a flow with a webhook, then a lookup that gets all the pages you need, and then a result mapping that maps all the looked-up data onto the source record. You'd run into trouble if your entire result set (looked-up data plus the initial webhook payload) exceeds 5 MB. You could get further by using transformations to drop fields you don't need and keep only essential data, but you'd eventually hit a limit there as well.
  2. Alternatively, follow the approach described in this thread: Shopify Payout Transactions: response stream exceeded limit of 5242880 bytes - #2 by tylerlamparter