Response Stream Exceeded Limit :: Pass data between Flows without a Listener

I am using Ramp - Accounting - List Options for a given custom accounting field and running into the response-stream-exceeded limit of 5242880 bytes. I have looked at a number of responses to similar questions, here:

I have a series of flows where flow 1 triggers flow 2, etc., in order to ensure that the data has completed before moving on to the next step. Where I am running into the 5 MB limit is on my current step 5 (out of ~10); so, if I were to set this up as noted above, 5.1 would listen to 5.0. The issue is that I cannot have step 6 start until both 5.0 and 5.1 are complete. Is there a way to do what I need with a Listener (listen for the end of a process?), or to stop the error without a listener, or …? Thanks.

  • Can you explain your workflow a bit?
  • Why do you need to fetch the field options?
    • Could you pass query params to fetch less?
    • Could you transform the data after to drop fields you don’t need?
    • Could you put an output filter on the lookup to drop results you don’t need?
  • Do you have “Path to records” under the non-standard API response pattern set to “data”?
  • Do you have paging set up on this lookup?
  • What is your page size on the export of this flow?
  • What else in the flow is being looked up, and can you look up less, return fewer fields, or filter/transform to include only the fields you need?

I tried it again with the preSaveHook dumping everything but the field I need … and it seems to be working. I will do some more testing in the AM, but I might have jumped the gun on this one. I had already started my responses prior to it working, so here are the answers in the event I need to come back to this tomorrow.
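For anyone else trying the same thing, here is a minimal sketch of the kind of preSaveHook I mean. It follows the standard Celigo preSavePage shape (takes an `options` object with `data` and `errors`, returns `{ data, errors, abort }`); the list of fields kept is illustrative, not my exact configuration.

```javascript
// Sketch of a preSavePage hook that keeps only the fields needed
// downstream and drops everything else, shrinking each record before
// it counts against the 5 MB response-stream limit.
// The field names in `keep` are illustrative.
function preSavePage(options) {
  const keep = ['name', 'ramp_id', 'is_active']; // fields actually used later
  const data = options.data.map((record) => {
    const slim = {};
    for (const key of keep) {
      if (key in record) slim[key] = record[key];
    }
    return slim;
  });
  return { data, errors: options.errors, abort: false };
}
```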

My workflow is to pull all of my data from my “CRM” and compare it to what is in Ramp. I need to insert new records and retire old ones. I’m currently doing this by pulling from my CRM into Azure and then pulling from Ramp into Azure and comparing everything to make sure I don’t have values in A that aren’t in B and vice versa.
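The comparison step itself is just a two-way set difference. A rough sketch of the logic (field names are illustrative; my real comparison runs in Azure):

```javascript
// Sketch of the A-vs-B comparison: given records from the CRM and from
// Ramp, keyed on a shared code, work out which values are missing from
// Ramp (insert) and which exist only in Ramp (retire).
function diffFieldOptions(crmRecords, rampRecords) {
  const crmCodes = new Set(crmRecords.map((r) => r.code));
  const rampCodes = new Set(rampRecords.map((r) => r.ramp_code));
  return {
    toInsert: crmRecords.filter((r) => !rampCodes.has(r.code)),
    toRetire: rampRecords.filter((r) => !crmCodes.has(r.ramp_code)),
  };
}
```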

I do have Path to records set to data.

I have paging set up (Next URL and page.next).

Page size is set to default (20)
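For context, the Next URL pattern just keeps requesting page.next until it comes back null, so the total data pulled is the same regardless of page size; only the size of each individual response changes. A sketch of what the connector is doing under the hood (`fetchPage` is a stand-in for the HTTP call):

```javascript
// Sketch of the "Next URL" paging pattern: follow page.next until it
// is null, accumulating every record. fetchPage is a hypothetical
// stand-in that returns a parsed body of shape { page: { next }, data: [...] }.
async function fetchAllOptions(fetchPage, firstUrl) {
  const all = [];
  let url = firstUrl;
  while (url) {
    const body = await fetchPage(url);
    all.push(...body.data);
    url = body.page ? body.page.next : null;
  }
  return all;
}
```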

Glad the dropping of fields seems to have solved it. You may also reduce your page size to 10 or less if it happens again.

I was wrong; my transform was just obscuring the error (passing it to a “no additional data is needed” step, which is incorrect). I am still up against this problem. Once it has failed (11 of the 16 records fail every time), if I retry an individual record, it succeeds (or at least it says it does):

Example of my retry data:

{
    "name": "Billable",
    "ramp_id": "af64b6f4-a165-4aa3-a11c-dec180162f7a",
    "is_active": true,
    "ramp_code": "Billable",
    "azure_id": 15,
    "azure_col_name": "Billable",
    "azure_is_active": true,
    "azure_ramp_code": "Billable"
}

My call in Ramp: /accounting/field-options?field_id={{record.ramp_id}}&page_size=20

This is all it should return (Postman response):

{
    "page": {
        "next": null
    },
    "data": [
        {
            "id": "false",
            "visibility": "VISIBLE",
            "code": null,
            "ramp_id": "bf8120f5-155c-49a4-946d-20af19282b96",
            "is_active": true,
            "value": "false",
            "display_name": null,
            "updated_at": "2025-08-07T22:34:41+00:00",
            "created_at": "2025-08-07T22:34:41+00:00"
        },
        {
            "id": "true",
            "visibility": "VISIBLE",
            "code": null,
            "ramp_id": "fd4b2d29-b9b6-4e1a-802c-96ad8a811c9d",
            "is_active": true,
            "value": "true",
            "display_name": null,
            "updated_at": "2025-08-07T22:34:41+00:00",
            "created_at": "2025-08-07T22:34:41+00:00"
        }
    ]
}

But instead I get: “The lookup operation exceeded the limit of 5242880 bytes. Reduce the page size or use a transformation to remove fields and optimize record structure.”

I just don't understand … I have set the Page Size to 1 and tried a number of “solutions” that solved nothing. Please help.

This is now working (after setting Page Size at the Export (not Lookup) level to 1) …