Improve Import Performance in Microsoft SQL

Is there a better way to import data into Microsoft SQL Server, or Azure SQL for that matter? At the moment the records are being inserted one by one; we would like to explore whether there are options to import records in bulk/batches.

@tylerlamparter Sorry to tag you, but thought you may want to guide here :)

Hi @vreddhibhat,

We plan to add metadata support for Microsoft SQL Server (cloud-hosted) in the upcoming release. As part of this feature, you will be able to use bulk insert and map source data to destination columns seamlessly.

Please look out for our release notes for more details.

Best Regards

Thank you @rohitprasad. Is there an estimated timeline for this release?

Hi @vreddhibhat,

This feature is planned for the September release, so it should be available on or around September 26th, depending on the region (NA/EU).

Please let us know if you have any follow-up questions.

Best Regards,

@vreddhibhat, @rohitprasad's answer is the best option and the release isn't too far away. In case you're curious how you could do it now, you could have a preSavePage script on the export side that converts a whole page of records into a single record with an array of "lines". So the input of the preSavePage script would be [{},{},{},{},{},{},{}] and the output would be [{"record_data":[{},{},{},{},{},{},{}]}]. After that, you can use an {{#each}} handlebars expression within the SQL query editor to loop through the array and insert multiple values at once.

Thank you @tylerlamparter. Doesn't the {{#each}} loop introduce the same performance hiccup?

@vreddhibhat no, because you would be making one insert statement with multiple values per page of data versus one insert statement for each value/record of data. Here is an example:

insert into database.schema.table (field1, field2) 
values {{#each record.record_data}}
({{field1}},{{field2}})
{{#if @last}}{{else}},{{/if}}
{{/each}}
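
For reference, assuming a page containing two records (the field values below are placeholders, not from the actual data), the template above would render to a single multi-row statement roughly like this; note that string values would need to arrive already quoted or be quoted in the template:

insert into database.schema.table (field1, field2)
values
(101, 'alpha'),
(102, 'beta')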

@tylerlamparter Would you be able to help me with the pre-save script you referred to earlier, the one that converts the records in a page into a single array of records? I am unable to test it easily, so I was wondering whether you have readily available JS code.

@rohitprasad May I know whether bulk insert and data mapping is still on track to be released this week as per the original plan?

@vreddhibhat the preSavePage script would look like this on the export step.

function preSavePage (options) {
  // Wrap the whole page of records into a single record whose
  // record_data property holds the original array, so the SQL
  // query can loop over it with {{#each record.record_data}}.
  let recordsArray = [{ record_data: options.data }];
  return {
    data: recordsArray,
    errors: options.errors,
    abort: false,
    newErrorsAndRetryData: []
  };
}
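
For illustration only (the field names and values below are hypothetical and simply match the earlier SQL example), a page of two records would be transformed roughly like this:

// options.data as received by preSavePage: one object per source record
[
  { "field1": 101, "field2": "alpha" },
  { "field1": 102, "field2": "beta" }
]

// data as returned by the script: a single record wrapping the whole page
[
  { "record_data": [
      { "field1": 101, "field2": "alpha" },
      { "field1": 102, "field2": "beta" }
  ] }
]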

The SQL Server enhancement is scheduled to release on Thursday.