@harishbakka I reviewed the ticket and the meeting recording and have a solution for you. This is actually the solution we use for our own integration apps.
To summarize: you have an export that gets Shopify payouts, then a lookup step that gets the transactions for each exported payout id, and lastly you want to insert each transaction into Google BigQuery. The issue you're running into is the 5 MB page size limit, because a single payout can have that many transactions (a good problem to have, since it means sales are good!).
To solve this, you'll need 2 flows. The first flow grabs the payouts, inserts the payout data into Google BigQuery, and ends with a lookup step that gets the transactions. On that final lookup step, you'll add a preSavePage script that makes an API call to the second flow, sending it the transaction data returned by the lookup. The second flow then receives the incoming transactions and inserts them into Google BigQuery. This works because the preSave script ends up returning empty objects: the records are no longer needed in flow 1 once they've been sent to flow 2, and a page of empty records never hits the 5 MB page limit.
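For context, each record the script forwards is one Shopify payout transaction, so flow 2's listener receives pages of records that look roughly like this (an illustrative sample based on Shopify's payout transactions schema; your exact fields may differ):

{
  "id": 699519475,
  "type": "charge",
  "payout_id": 623721858,
  "payout_status": "paid",
  "currency": "USD",
  "amount": "115.00",
  "fee": "3.64",
  "net": "111.36",
  "source_id": 460709370,
  "source_type": "charge"
}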
Here is the setup below. Note, you'll need to use your own export id, which can be found on the IO listener in flow 2 (not to be confused with a webhook: when adding the source, choose Celigo integrator.io, then choose Listen), and update the script with that id. Additionally, disregard my use of Celigo integrator.io bubbles on everything except the listener. I just used them as placeholders, pointed at the mirror endpoint, in place of real Shopify and Google BigQuery bubbles.
import {exports} from 'integrator-api';

function preSavePage (options) {
  // Hand the full page of transactions off to flow 2's listener export.
  exports.run({_id: '6668f68176867e29a18f722e', listenerData: options.data});

  // Replace every record with an empty object so flow 1's page stays tiny
  // and never hits the 5 MB page size limit.
  let output = [];
  options.data.forEach((d) => {
    output.push({});
  });

  return {
    data: output,
    errors: options.errors,
    abort: false,
    newErrorsAndRetryData: []
  };
}
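One optional hardening: as written, if the call to flow 2 fails, the script still returns empty records and the transactions are silently dropped. Here's a sketch of how you could surface that as record-level errors instead; it assumes exports.run throws on failure and uses the { retryData, errors } shape for newErrorsAndRetryData, so treat it as a starting point rather than a drop-in:

import {exports} from 'integrator-api';

function preSavePage (options) {
  let output = [];
  let newErrorsAndRetryData = [];
  try {
    // hand the page off to flow 2's listener export (use your own _id)
    exports.run({_id: '6668f68176867e29a18f722e', listenerData: options.data});
    // handoff succeeded, so blank out the records for flow 1
    options.data.forEach(() => output.push({}));
  } catch (e) {
    // handoff failed: keep each record retryable and attach an error to it
    options.data.forEach((d) => {
      newErrorsAndRetryData.push({
        retryData: d,
        errors: [{code: 'flow2_handoff_failed', message: e.message}]
      });
    });
  }
  return {
    data: output,
    errors: options.errors,
    abort: false,
    newErrorsAndRetryData: newErrorsAndRetryData
  };
}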