23/09/2025 14:17:17 SYSTEM Celigo Realtime Import Restlet Runner
CUSTOMDEPLOY_CELIGO_RT_IMPORT_DEPLOY 9801 Celigo Service Account
SSS_TIME_LIMIT_EXCEEDED Script Execution Time Exceeded.; ID:
What I’ve tried:
Retrying all 35 errors at once → same error occurs again.
Retrying a single record → it succeeds without issues.
I don’t know whether the cause is a NetSuite script or something on the Celigo side. The strange thing is that a single record works, but retrying all records does not.
Hi @nuriensing, this error is usually seen when NetSuite (the RESTlet) takes more than 5 minutes to process all records in a batch. It can be caused by several factors:
NetSuite is taking a long time to process each record
Each record has multiple sublists that take extra time to process
A temporary issue with the NetSuite account
Can you monitor the flow for a few more runs? If the issue persists, try a smaller batch size. If it still persists, please raise a support case and we can take a closer look at the account.
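A rough back-of-the-envelope check shows why a single record succeeds while the full batch times out. The per-record timing below is a hypothetical assumption for illustration, not a figure from this thread; only the ~5-minute RESTlet limit comes from the advice above:

```python
# Hypothetical illustration: why a full batch can exceed the ~5-minute
# (300-second) RESTlet execution limit while a single record succeeds.
RESTLET_TIME_LIMIT_S = 300   # NetSuite RESTlet execution time limit (~5 min)
PER_RECORD_SECONDS = 10      # assumed average processing time per record

def batch_fits(batch_size, per_record_s=PER_RECORD_SECONDS):
    """Return True if a batch is expected to finish within the time limit."""
    return batch_size * per_record_s <= RESTLET_TIME_LIMIT_S

print(batch_fits(1))    # single retry: 10 s, well under the limit -> True
print(batch_fits(35))   # full batch: 350 s > 300 s, times out -> False
```

Under this assumption, any batch larger than 30 records would hit the limit, which is why lowering the batch size is the first lever to try.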
I’m also experiencing this issue on the payout records starting from November 17th. When I retry the errors, I continue to encounter the same problem. The sublist contains only one record. I have increased the concurrency for the NetSuite connection and even set up a new connection, but the issue still persists. Please share your thoughts.
I think this might be a potential fix, but please correct me if I’m wrong. It’s possible that too many API requests are being made to NetSuite. One idea to address this could be to set a batch size limit in the import step — for example, setting it to 7. This would group up to 7 records into a single API call, which can help reduce the number of requests and avoid overloading the NetSuite API.
The default batch size is 100 if nothing is set. The help text is pending an update to make this clearer:
Enter the number of records to import into NetSuite at a time. NetSuite’s APIs allow only a certain number of API calls at a time, and each call costs so-called “points” depending on the data, such as the number of records or the number of lookups configured per record. If too much data is sent in a single page, NetSuite might return errors stating that you’ve run out of points, and your flow will fail. You can avoid this by managing the size of the data sent to NetSuite using the Batch size limit setting, which determines the number of records sent with each API call to NetSuite. This setting is useful for flow configurations where it’s not possible to manage the page size of the source data. One example is one-to-many lookups, where each record can result in an exponentially larger number of records being sent to the import. By setting a batch size limit, the results from the lookup are sent to the import in batches, where the maximum batch size is this setting’s value, keeping your messages within NetSuite governance limits. The default value (when you leave this setting blank) is 100.
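The batching behavior described above amounts to simple chunking: the source records are split into groups of at most the batch size, and each group becomes one API call. The function name below is illustrative, not Celigo’s actual implementation:

```python
def chunk_records(records, batch_size=100):
    """Split records into batches; each batch would become one API call
    to NetSuite. 100 mirrors the documented default when the Batch size
    limit field is left blank."""
    if batch_size <= 0:
        raise ValueError("batch_size must be positive")
    return [records[i:i + batch_size]
            for i in range(0, len(records), batch_size)]

# With a batch size limit of 7 (the example suggested earlier in the
# thread), 23 records become 4 calls of at most 7 records each.
records = list(range(23))
batches = chunk_records(records, batch_size=7)
print(len(batches))       # 4 batches: 7 + 7 + 7 + 2
print(len(batches[-1]))   # 2
```

This is why the setting helps with one-to-many lookups: even if a lookup fans 7 source records out into hundreds of rows, each call to NetSuite is still capped at the configured size.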
Something else that I find helps with this issue is to increase the concurrency of the NetSuite connection, if possible. During the holiday season I see this quite a bit, especially with Shopify processes.