Upload data as Microsoft LakeHouse TABLE

Any idea how to upload data as a Lakehouse TABLE? We are able to upload as a Lakehouse FILE, but we haven't figured out how to upload as a Lakehouse TABLE.

We have a Microsoft Fabric environment, where we have created a workspace and a Lakehouse within that workspace.

In this Lakehouse there are two types of objects: FILES and Delta Lake Tables. Files can be linked to blob storage, and when we upload JSON files to the blob storage we see these files in the Lakehouse as FILES.

Our data restitution team requires us to upload data as TABLES (Delta Lake tables), not as FILES.

How can we achieve this using the Celigo platform?

@tylerlamparter , any suggestion?

Hey @Otar_Melia

The core issue is that writing to a Delta Lake Table (vs. a File) in Fabric Lakehouse requires data to land natively in OneLake, not just an external blob container. Here's the path that should work using Celigo's HTTP connector:

Step 1: Write the file to OneLake via ADLS Gen2 REST API

OneLake exposes an ADLS Gen2-compatible REST endpoint, so even without a native ADLS Gen2 connector you can write files directly to it using Celigo's HTTP connector:

PUT https://onelake.dfs.fabric.microsoft.com/{workspaceId}/{lakehouseId}/Files/{yourfile.csv}

Auth is Azure AD OAuth 2.0 — Celigo's HTTP connector supports this natively.
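One wrinkle worth knowing: the ADLS Gen2 REST surface actually writes a file in three calls (create, append, flush), not a single PUT with a body. Here's a minimal Python sketch of Step 1 under that assumption; `workspace_id`, `lakehouse_id`, and the token are placeholders you'd supply from your own environment (in Celigo you'd model these as three HTTP connector calls rather than running a script):

```python
# Sketch: write a file to OneLake via the ADLS Gen2-compatible endpoint.
# All IDs/tokens below are placeholders, not values from this thread.
from urllib import request

BASE = "https://onelake.dfs.fabric.microsoft.com"

def onelake_requests(workspace_id, lakehouse_id, file_name, data: bytes, token):
    """Build the three HTTP calls ADLS Gen2 needs: create, append, flush."""
    path = f"{BASE}/{workspace_id}/{lakehouse_id}/Files/{file_name}"
    headers = {"Authorization": f"Bearer {token}"}
    return [
        # 1) create an empty file at the path
        ("PUT",   f"{path}?resource=file", b"", headers),
        # 2) append the payload bytes starting at offset 0
        ("PATCH", f"{path}?action=append&position=0", data, headers),
        # 3) flush (commit) everything written so far
        ("PATCH", f"{path}?action=flush&position={len(data)}", b"", headers),
    ]

def send(method, url, body, headers):
    """Fire one of the calls; requires a valid Azure AD bearer token."""
    req = request.Request(url, data=body or None, headers=headers, method=method)
    with request.urlopen(req) as resp:  # network call
        return resp.status
```

In Celigo terms, that's three sequential HTTP steps in the flow, with the `position` value on the flush equal to the byte length of the uploaded content.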

Step 2: Call the Fabric Load Table API

Once the file is in the Lakehouse Files section, trigger the Load Table API to convert it to a Delta table:

POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/lakehouses/{lakehouseId}/tables/{tableName}/load
{
  "relativePath": "Files/{yourfile.csv}",
  "pathType": "File",
  "mode": "Append",
  "formatOptions": { "format": "Csv", "header": true }
}

This is an async operation — the response returns an operation ID you can poll to confirm completion.
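To make Step 2 concrete, here's a sketch of the load call plus the polling loop. The `/v1/operations/{operationId}` endpoint and the `status` values follow Fabric's general long-running-operation pattern; treat the exact field names as assumptions to verify against your tenant's responses:

```python
# Sketch: trigger the Load Table API, then poll the async operation.
# workspace/lakehouse/table IDs and the token are placeholders.
import json
import time
from urllib import request

API = "https://api.fabric.microsoft.com/v1"

def load_table_request(workspace_id, lakehouse_id, table_name, relative_path):
    """Build the Load Table POST: target URL and JSON body."""
    url = (f"{API}/workspaces/{workspace_id}/lakehouses/{lakehouse_id}"
           f"/tables/{table_name}/load")
    body = {
        "relativePath": relative_path,      # e.g. "Files/orders.csv"
        "pathType": "File",
        "mode": "Append",                   # or "Overwrite"
        "formatOptions": {"format": "Csv", "header": True},
    }
    return url, json.dumps(body)

def poll_operation(operation_id, token, interval=5):
    """Poll the operation (ID comes from the 202 response) until it finishes."""
    req = request.Request(f"{API}/operations/{operation_id}",
                          headers={"Authorization": f"Bearer {token}"})
    while True:
        with request.urlopen(req) as resp:  # network call
            status = json.load(resp).get("status")
        if status in ("Succeeded", "Failed"):
            return status
        time.sleep(interval)
```

In a Celigo flow you'd typically model the poll as a lookup step with a retry/branch on the returned status rather than a blocking loop.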

The two-step HTTP approach above is the most self-contained path within Celigo today. Happy to help configure the HTTP connector setup if you get stuck on the OAuth flow.