FAQ: Convert a blob key to a base64 string for import?

There is currently no way to convert a blob key to a base64 string natively in integrator.io. However, you can use an Amazon S3 bucket together with an AWS Lambda function as a workaround. This requires two flows, not because the flows are complex, but because the Lambda conversion happens asynchronously: a single flow would run so quickly that it would try to pull the converted file before the function had finished writing it. You’ll need:

Flow 1: Transfer file from FTP to Amazon S3

  1. First, create an FTP export.
    1. Do not parse your file. In this example, the exported file is a PNG.
  2. Import (transfer) your file to Amazon S3.
    1. Do not generate a file from your record.
    2. Set the Blob key path field in the Advanced settings to blobKey.

Transfer file to Amazon S3

In this example, the Amazon S3 file key (file name) is a handlebars expression that builds the file name from the generated blob key. For example, if your blob key is 234k, the S3 file key becomes 234k.png.
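
How you reference the blob key depends on how the record is exposed in your flow, so treat this as a hypothetical sketch: assuming the key lands in a field named blobKey, the File key field would contain something like:

{{record.blobKey}}.png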

Create an AWS Lambda function that is invoked by uploads to Amazon S3 storage

In AWS, use Lambda to create a function that reads the uploaded file and converts its contents to base64. The function needs to be triggered by an S3 upload; to configure this, use an Amazon S3 trigger to invoke the Lambda function.

Function overview

Trigger configuration

Tips:

  • Use a separate bucket for function outputs.
  • Customize the trigger by setting the suffix filter to the file type you’re uploading in the first flow. In this case, the file ends with .png, so set the suffix to .png. This also keeps the function’s own .json output from retriggering it when both are written to the same bucket, as in the example code below; see the sketch after these tips for a scripted version of the trigger.
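
If you prefer to script the trigger rather than click through the console, here is a minimal, unvalidated sketch using boto3. The bucket name and function ARN are placeholders, and it assumes you have already granted S3 permission to invoke the function (for example, with lambda add-permission):

import boto3

s3 = boto3.client('s3')

# Invoke the Lambda function whenever a .png object is created in the bucket.
# The bucket name and function ARN below are placeholders.
s3.put_bucket_notification_configuration(
    Bucket='my-blob-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:blob-to-base64',
                'Events': ['s3:ObjectCreated:*'],
                'Filter': {
                    'Key': {'FilterRules': [{'Name': 'suffix', 'Value': '.png'}]}
                },
            }
        ]
    },
)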

Warning: The code below is an example of converting the file. It has not been validated or endorsed by Celigo. Use this at your own risk.

import base64
import json

import boto3

def lambda_handler(event, context):
    # Identify the bucket and object key from the S3 trigger event.
    # Note: keys containing special characters arrive URL-encoded and
    # would need to be unquoted before use.
    s3_bucket = event['Records'][0]['s3']['bucket']['name']
    s3_key = event['Records'][0]['s3']['object']['key']

    s3 = boto3.client('s3')

    # Download the uploaded file and base64-encode its contents.
    response = s3.get_object(Bucket=s3_bucket, Key=s3_key)
    file_content = response['Body'].read()

    base64_content = base64.b64encode(file_content).decode('utf-8')

    # Wrap the encoded file in a JSON document for the second flow to parse.
    json_data = {
        "filename": s3_key,
        "base64encoded": base64_content
    }
    json_string = json.dumps(json_data)

    # Write the JSON next to the original file, e.g. 234k.png -> 234k.png.json.
    new_s3_key = s3_key + '.json'
    s3.put_object(Body=json_string, Bucket=s3_bucket, Key=new_s3_key)

    # Return the new S3 file key
    return new_s3_key
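
To sanity-check the function before wiring up the trigger, you can invoke the handler locally with a hand-built event. This is a hypothetical test only: the bucket and key are placeholders, and it assumes your local AWS credentials can read the object.

# Minimal fake S3 event mirroring the fields the handler reads.
test_event = {
    'Records': [
        {
            's3': {
                'bucket': {'name': 'my-blob-bucket'},
                'object': {'key': '234k.png'},
            }
        }
    ]
}

print(lambda_handler(test_event, None))  # e.g. prints 234k.png.json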

Flow 2: Get the new file from Amazon S3

  1. Create your Amazon S3 export. Parse the JSON file (a sample is shown after these steps).
  2. Import your new file into any application.
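
For reference, the JSON file written by the Lambda function for the 234k example would look like this (the base64 value is truncated for brevity; a base64-encoded PNG always starts with iVBORw0KGgo):

{
    "filename": "234k.png",
    "base64encoded": "iVBORw0KGgoAAAANSUhEUg..."
}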

Are there other ways to do this? Is there something on the roadmap?

Hi @maartenpanman

Thank you for reaching out. At present, it's not on the near-term roadmap.

Best Regards,

I built a different way to do this that runs entirely in Celigo. If you go to the marketplace, you can find it under "Blobkey Conversion Helper".

With that said, you normally don't need it. If you need to send base64 to your endpoint and you already have a blobKey, you can usually configure that directly on the import.

If you need to convert base64 to a blobKey, then you'd normally use the "Look up additional files (per record)" option for your lookup, and possibly set the "Path to file in HTTP response body" option if the base64 string isn't sent in the raw response body. This will give you a blobKey to use.
