Import Mapping JSON to Table

Would someone please help me understand how to deal with JSON responses in the import mapper?

JSON has a variable schema, so one record/JSON object might have 3 key/value pairs, and the next object might only have 2.

If I map all 3 keys to database columns, then when that second object is inserted into the table, my 3rd column's value shifts into my 2nd column (because the first or second key was missing from that second JSON object).

Please help me understand how to deal with these kinds of schema variations, which are common in JSON responses.

It is difficult to help without seeing your flow configs. Can you share screenshots of your flow, the import settings drawer, the mappings drawer, etc...? Please also share screenshots of the resulting errors or malformed data, if possible.

At a high level, it is super common to map variable JSON to database tables, and it is very common for JSON fields to be missing; it should just work. But maybe you are doing something unique, and that is where pics will help set context.

Flow Configs

Flow

Import settings

The parsed output from my export looks like this:

{
  "page_of_records": [
    {
      "record": {
        "payload": {
          "Orders": [
            {
              "AmazonOrderId": "113-8158555-4786664",
              "EarliestShipDate": "2022-01-06T07:59:59Z",
              "SalesChannel": "Amazon.com",
              ...

Mappings

In this scenario, how do I deal with a JSON record that does not contain the "PaymentMethod" key or value?

Errors

Initially this problem did cause some unexpected errors complaining about incorrect data types when inserting into Snowflake. After reducing the number of columns I was able to get the flow working, but the resulting data in the table shows that the missing keys in the JSON response caused a shift in the table data:

Here in Snowflake we can see that for some records there was no ordertotal_currencycode or ordertotal_amount, so the numberofitemsshipped value was shifted over to the left.

This shifting that you are reporting here is likely what caused the error in your other post, i.e. "Numeric value 'Other' is not recognized".

I will ask my team to check whether there is a bug in our product when mapped JSON fields are missing, and whether it is possible that we allow columns to shift.

In the meantime, you can work around this with any of the following solutions.

  1. See pic below. You can use one of the mapping options for when a value is missing.
  2. Instead of #1, I would highly recommend mapping the entire JSON record into a single variant column, and then use Snowflake views to break the data into columns (see the SQL sketch after this list). This is way more flexible/powerful.
  3. You can use a preSavePage JavaScript hook on your source bubble in your flow to make sure that every field always has a value and, if one is missing, set a default, etc...
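
To make #2 a bit more concrete, here is a rough Snowflake sketch. The table and view names are just placeholders, and the field paths (in particular the nesting under OrderTotal) are my assumptions based on the sample order payload above, so adjust them to match your actual records:

-- Land each whole JSON record in a single VARIANT column
CREATE OR REPLACE TABLE amazon_orders_raw (orders VARIANT);

-- Break the variant out into typed columns with a view.
-- A key that is missing from a given record simply comes back as NULL
-- (no shifting), and COALESCE can supply a default, which is the #1 idea.
CREATE OR REPLACE VIEW amazon_orders_v AS
SELECT
  orders:AmazonOrderId::string                       AS amazon_order_id,
  orders:EarliestShipDate::timestamp_tz              AS earliest_ship_date,
  orders:SalesChannel::string                        AS sales_channel,
  orders:OrderTotal.CurrencyCode::string             AS ordertotal_currencycode,
  orders:OrderTotal.Amount::number(10,2)             AS ordertotal_amount,
  orders:NumberOfItemsShipped::number                AS numberofitemsshipped,
  COALESCE(orders:PaymentMethod::string, 'Unknown')  AS payment_method
FROM amazon_orders_raw;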

Thanks.

Actually, I started out wanting to do option #2, but I couldn't figure out how to do it (and I couldn't find the documentation to show me how).

Given my configuration how would I map the entire record to a variant?

I have created this table in Snowflake:

CREATE OR REPLACE TABLE tuc_na1_p_as_orders (orders VARIANT);

I'm not sure how to tell the mapping to select the whole record (maybe the word "this")?

Also, I think I realize what you were saying in the other post: jsonSerialize is a helper, and its argument is either the column to serialize or the entire record. Right?
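
If that reading is right, I assume the mapping expression for the variant column would be something along the lines of {{jsonSerialize this}} to serialize the entire record (that exact expression is my guess, not something I have confirmed). And presumably, once the whole record lands in the variant column, an order with no PaymentMethod just reads back as NULL instead of shifting, which I could spot-check with something like:

SELECT
  orders:AmazonOrderId::string AS amazon_order_id,
  orders:PaymentMethod::string AS payment_method
FROM tuc_na1_p_as_orders
WHERE orders:PaymentMethod IS NULL;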

Yeah, just copy exactly what I have in my screenshot above. We do have docs for this as well. Can you check out this page? https://docs.celigo.com/hc/en-us/articles/360049317752-Import-data-into-Snowflake

Thanks, that page is just what I was looking for.

Where is the keyword "this" explained? I didn't find it in the Handlebars helper reference: https://docs.celigo.com/hc/en-us/articles/360039326071-Handlebars-helper-reference

Also, not sure if this will help, but instead of doing "one to many" in the Snowflake import bubble, you can specify the path to the individual records in your Amazon source export bubble; then there is no need to do "one to many" or deal with arrays of orders in the import, etc...

i.e. Open your source export and look for this field

Thanks again!

I like this method much better.