Processing of submitResponse did not return same record count as the current batch size

Good morning everyone,

Following the previous issue I had when trying to do bulk API calls to a Magento 2 REST API, I managed to win the round. However, I am now faced with an error that is apparently coming from integrator.io. I've found very little information about this error, so here I am.

So basically, I am using the HTTP connector to send Magento 2 Async Bulk REST API requests. I aggregate the data of 2 products into a single call to be processed in batch. Once the flow executes, it is marked as completed BUT these errors are listed.

When the request is sent to the platform, the jobs are created correctly in Magento 2. It's integrator.io that returns these errors. From the looks of it, it's not pleased with what the Magento 2 REST API has returned.

Thing is, Magento 2 API does not return the original data but instead returns an acknowledgment that the "jobs" were created (plus other pieces of information).

How can I manage this integrator.io error gracefully?

Can you share the exact HTTP response body returned by Magento? Please be sure to mask any sensitive data.

Hi @scotthenderson

Here's the response body:

{
  "bulk_uuid": "a3237122-22e0-441c-87c0-c606d71d4514",
  "request_items": [
    {
      "id": 0,
      "data_hash": "eabcffba2ebf9015dfa0b4e6d8d30121e5dfb5f11c66b34a8fa8c4a84979a110",
      "status": "accepted"
    },
    {
      "id": 1,
      "data_hash": "80c220a305ddc8883167be5a42fc6a77463fd9ab337af208317ccccf8f575d96",
      "status": "accepted"
    }
  ],
  "errors": false
}

No worries about sensitive data :)

This is very simple to fix. You need to set the field 'Path to records in HTTP response body' to 'request_items'.

In the other post I mentioned you might need to do this, "You might actually need to use this field BTW. I am not sure what the response structures look like for Magento 2, but this is a pretty commonly used field for when the results are wrapped by a variable."

Here is the help text for the field too. "Use this field if the records returned by the API are not located at the root level of the HTTP response body. For example, an API could return records inside a container object like 'results.customers'."
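To illustrate what that field does (this is a hypothetical sketch, not Celigo's actual implementation), a 'Path to records' setting typically splits the path on dots and walks it into the response body, and whatever it lands on is treated as the array of records:

```javascript
// Hypothetical sketch of how a 'Path to records in HTTP response body'
// setting is commonly applied (illustrative only, not integrator.io code).
function extractRecords(body, path) {
  // Walk the dot-separated path, e.g. 'results.customers' or 'request_items'.
  return path.split('.').reduce((node, key) => node && node[key], body);
}

const magentoResponse = {
  bulk_uuid: "a3237122-22e0-441c-87c0-c606d71d4514",
  request_items: [
    { id: 0, status: "accepted" },
    { id: 1, status: "accepted" }
  ],
  errors: false
};

// With 'request_items' as the path, two records are extracted,
// matching the batch size of 2 that was sent to Magento.
const records = extractRecords(magentoResponse, "request_items");
```

This is why the record count lines up once the path is set: the two `request_items` entries stand in for the two records that were submitted in the batch.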

Hi @scotthenderson

Yes, I do remember reading you mention this field. I had initially tried to fill in the first three fields:

  • Path to records in HTTP response body
  • Path to id field in HTTP response body
  • Path to error field in HTTP response body

But I had errors while using them. Clearing all three AND filling in only the first one with request_items did fix my import! Thanks!

However, I'm now trying to understand which values I must input in the other fields. To my understanding, here's my tentative mapping

And here's the error I get

Message: 
"could not extract id using path: id from resource: {\n id: 0,\n data_hash: 'eabcffba2ebf9015dfa0b4e6d8d30121e5dfb5f11c66b34a8fa8c4a84979a110',\n status: 'accepted'\n}"

Code: missing_id

Source: HTTP

Timestamp: 2022-04-18T17:59:22.346Z

Error ID: 449227288

Classification: Missing

Trace key: 7349540585690

I had previously tried bulk_uuid before realizing that this mapping is documented as being relative to the records returned. Hence my attempt at using id instead.

Thanks in advance

I would remove all of these fields that you are setting in this section, and the only field that you really need to set right now is 'Path to records in HTTP response body'.

After the above is working, then work on the error handling fields, and if you want help from the community, then please share an exact HTTP response for a request that contains errors.

Hi everyone! I'm resurrecting this post because I'm now facing the count_mismatch error once again.

Previously, I was trying to group products with their variants to do batch inventory updates. With Scott's help above, it worked fine. I had 42 products (some with variants, some without) and a return from my Magento API acknowledging 42 bulk update tasks.

Now, I'm back in this flow trying to reduce my number of bulk update tasks to 1 per page of data. Sadly, this means that request_items will now only list one bulk update task per page, which triggers Celigo's count_mismatch error again.

Here's what my original data looks like

{
  "batch_of_records": [
    {
      "record": {
        "variants": [
          {
            "variant_id": 40162751840304,
            "inventory_quantity": 4
          }
        ]
      }
    },
    {
      "record": {
        "variants": [
          {
            "variant_id": 40162751840304,
            "inventory_quantity": 4
          }
        ]
      }
    },
    {
      "record": {
        "variants": [
          {
            "variant_id": 40162751840304,
            "inventory_quantity": 4
          }
        ]
      }
    },
    {
      "record": {
        "variants": [
          {
            "variant_id": 40162751840304,
            "inventory_quantity": 4
          }
        ]
      }
    },
    {
      "record": {
        "variants": [
          {
            "variant_id": 40162751840304,
            "inventory_quantity": 4
          }
        ]
      }
    }
  ]
}

Here's how I'm transforming this data

[
  {
    "sourceItems": [
      {
        "sku": "2243SH40162751840304",
        "source_code": "vendor_source_2243",
        "quantity": 4,
        "status": 1
      },
      {
        "sku": "2243SH40162751840304",
        "source_code": "vendor_source_2243",
        "quantity": 4,
        "status": 1
      },
      {
        "sku": "2243SH40162751840304",
        "source_code": "vendor_source_2243",
        "quantity": 4,
        "status": 1
      },
      {
        "sku": "2243SH40162751840304",
        "source_code": "vendor_source_2243",
        "quantity": 4,
        "status": 1
      },
      {
        "sku": "2243SH40162751840304",
        "source_code": "vendor_source_2243",
        "quantity": 4,
        "status": 1
      }
    ]
  }
]

Here's what Magento returns

{
  "bulk_uuid": "329f5363-14d2-4405-8231-53fabcdb7004",
  "request_items": [
    {
      "id": 0,
      "data_hash": "5da1af4191fdf640ca1a42e985fe6f78a6868eee2e654b25b6c2e9cf18cf1496",
      "status": "accepted"
    }
  ],
  "errors": false
}

So! How can I make it so that Celigo stops returning the count_mismatch error?

There are probably 2 ways to accomplish what you are trying to do.

The first option is to use a 'Pre Save Page' hook in your source export to combine all the records in the page into a single record (with the exact JSON structure that you want), and then send that to Magento in your HTTP import as-is, without setting batch size > 1. The 'Pre Save Page' hook is unique in that it allows you to merge records together and return any number of records for the flow to process for that page. Said another way, the 'page size' property in your source export controls the number of records that get sent to your 'Pre Save Page' hook, but your hook code is free to return a different number of records. For your specific use case, you should combine all the records in the page into a single record (i.e. data.length should equal 1 in your return array). Your HTTP import then does NOT need batch size > 1, since each page is already a single combined record, and Magento's single response will match the record count of the page.

The second option (assuming Magento async APIs behave like typical async APIs) is to use 'Async helpers' to handle all the nuances of working with async APIs. You should read our help guide here on this topic to see if this is perhaps a better solution than trying to use hooks and HTTP batching.