Use blobKey in a flow it was not created in

Hello! Is it possible to use a blobKey that was created in a different flow? I have the following flow structure:

Flow 1: Preps staged file data for upload into Shopify. The flow retrieves the file data from the source app (Salsify in this case); the file is retrieved and stored in Celigo, and I get the blobKey. I retrieve the file data here because I have to provide the computed file size to Shopify, which passes it to their Google storage to generate an upload policy, and Google will reject the upload if it doesn't receive exactly the file it expects, so it has to be very specific. The file data is then staged to Shopify and I pass the record data to a second flow, Flow 2. This works great.

Flow 2: Listens for the record data from Flow 1, pushes the data to Shopify's Google storage, and then uploads the final file data into Shopify. The files need to be uploaded to Google storage as form-data, so the blobKey from Flow 1 is used to send the blob data as a form-data attachment. However, I get a “Could not locate the blob object with blobKey” error when uploading to Google storage.
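For context, here is roughly what that form-data upload looks like outside Celigo (inside Celigo this would be an HTTP import configured to send multipart form-data, not a script). The `target.url` / `target.parameters` shape follows Shopify's `stagedUploadsCreate` response as I understand it; the helper name is hypothetical:

```javascript
// Sketch: build the multipart body for a staged upload to Shopify's
// Google storage bucket. `target` is assumed to be one entry from
// stagedUploadsCreate's stagedTargets (url + parameters array).
function buildUploadForm(target, fileBytes, filename) {
  const form = new FormData();
  // The policy parameters (key, policy, signature, ...) must be
  // appended before the file field.
  for (const { name, value } of target.parameters) {
    form.append(name, value);
  }
  // The file bytes go last, as a form-data attachment.
  form.append('file', new Blob([fileBytes]), filename);
  return form;
}

// Usage (hypothetical values):
// const form = buildUploadForm(target, bytes, 'image.png');
// await fetch(target.url, { method: 'POST', body: form });
```

This is only a sketch of the request shape Google's policy upload expects, not Celigo's internal behavior.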

I had this set up originally in a single flow and it worked great, but I was testing with a single file. I can receive up to 7 files per record from Salsify, which made the data structure messy, and Shopify uses two separate Google storage buckets for video and image uploads, which forced me to branch; the data structure wasn't giving me the values I needed for branch selection. There was no way to rearrange the data so it paged differently, so I opted for chained flows to simplify the data structure, letting me branch and upload the files to Google.

So, all that being said, I know that the blobKey reference works in a single flow but when I reference it in the second flow it doesn’t have any awareness of it. Can you use a blobKey in a flow separate from the one it was generated in?

Thanks!

No, unfortunately I don't think this is possible; in my experience, blobKeys are only usable within the same flow execution. If I were in your shoes, I would try a single flow with branching, which would let you keep using the blobKeys.

@Garfield_Grant is correct; blobKeys are tied to the job and its execution. You should still be able to get this into one flow, but you may need to transform your data and/or use one-to-many on the Google APIs upload step.

@tylerlamparter @Garfield_Grant Thanks for the confirmation. I figured as much but wasn't certain. Unfortunately, the data we receive is a mess. I've tried every way possible to transform it and use one-to-many in a single flow with branching, but I haven't been successful, hence the second flow. The data we receive is essentially multiple records inside a record:

{
  "key1" : "[val1, val2, val3, ..., valn]",
  "key2" : "[val1, val2, val3, ..., valn]",
    ...
  "keyn" : "[val1, val2, val3, ..., valn]"
},
{
  "key1" : "[val1, val2, val3, ..., valn]",
  "key2" : "[val1, val2, val3, ..., valn]",
    ...
  "keyn" : "[val1, val2, val3, ..., valn]"
},
... etc.

So each record is a set of strings of arrays that need to be converted into real arrays and then parsed into one record per index, which I've done via a JavaScript transform. The problem is that the parsed values are stored as a sub-array on the main record and I don't know how to transform them into top-level record objects. The file types (both video and image) aren't accessible when branching because they're in the sub-array, and you can't do one-to-many there.
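For anyone hitting the same shape of data, the parse-and-pivot step described above can be sketched like this. This is a generic illustration, not the exact transform used here; the key names match the example payload:

```javascript
// Sketch: convert a record of stringified arrays, e.g.
//   { key1: "[val1, val2]", key2: "[val1, val2]" }
// into one flat record per index:
//   [{ key1: "val1", key2: "val1" }, { key1: "val2", key2: "val2" }]
function pivotRecord(record) {
  const parsed = {};
  let length = 0;
  for (const [key, value] of Object.entries(record)) {
    // Strip the surrounding brackets and split on commas. A simple
    // split suffices for the loosely formatted strings shown above;
    // properly quoted JSON could use JSON.parse instead.
    const arr = value.replace(/^\[|\]$/g, '').split(',').map(s => s.trim());
    parsed[key] = arr;
    length = Math.max(length, arr.length);
  }
  // Emit one object per index, taking the i-th value of every key.
  const rows = [];
  for (let i = 0; i < length; i++) {
    const row = {};
    for (const key of Object.keys(parsed)) row[key] = parsed[key][i];
    rows.push(row);
  }
  return rows;
}
```

The remaining problem discussed in the thread is that these pivoted rows land in a sub-array on the main record rather than as top-level records.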

I’ve thought of pre-processing the data and staging it for retrieval as well, but the second flow is working with my workaround for the data.

I appreciate the help, thanks!

The best places to transform data after lookups are transforms on the lookups/imports, response mapping to place the data where it can work, or a postResponseMap script. The postResponseMap script gives you the most flexibility to bring data up to a top-level object array so that it can be used for a one-to-many later. This is where you could try Celigo AI to build the script for you by telling it what input you have and what output you want.
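As a rough illustration of that approach, a postResponseMap script could lift a nested sub-array up to a top-level array field for a later one-to-many to iterate over. The `options.postResponseMapData` shape follows Celigo's documented script stub; the `items`/`files` field names here are hypothetical placeholders:

```javascript
// Sketch of a Celigo postResponseMap script: for each record, gather
// the file objects buried in nested sub-arrays (hypothetical "items"
// field) into a single top-level "files" array, so a one-to-many step
// can point at record.files and branching can see each file's type.
function postResponseMap(options) {
  return (options.postResponseMapData || []).map(record => ({
    ...record,
    files: (record.items || []).flatMap(item => item.files || [])
  }));
}
```

Celigo AI or the script editor's stub can be used as the starting point; the flattening logic is the part you would adapt to your actual field names.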


Hey @tylerlamparter, thanks for the explanation. Transforming at the source is exactly what I was doing, and the data was restructured as a top-level array. The issue was that there are multiple file types in the array, and the values needed for the branch conditions are inside those sub-objects, so there wasn't anything to branch on, and branching doesn't support one-to-many.

Regardless, I was able to get the source data reworked and split up so it's easier to manage, and I can now determine the file type at the top-level object. All is good.

Thanks for all the help!
