How can I post data to a Microsoft Fabric Eventstream? I have a connection to Fabric, but I could not figure out how to use the Eventstream.
It seems you might have to use the custom endpoint configuration. What sort of data/apps are you planning to ingest there?
Well, there is the native Fabric Celigo integration, which supports only listing/adding/modifying the eventstream itself, not posting data to it. I really wonder whether anyone is using Celigo to create eventstream definitions.
The problem with the custom endpoint is that:
- It has a complicated auth mechanism, which we would need to implement in Celigo ourselves, and integration platforms normally help us avoid this kind of hassle.
- The custom app uses a different endpoint, which means an additional connection and most probably additional cost.
Is there any roadmap to include/enrich the Microsoft Fabric integration?
Your points are valid — Eventstream ingestion sits on a separate Event Hubs–compatible endpoint with SAS auth, outside the unified Fabric REST API surface that the native connector uses. Before suggesting a path forward, one clarifying question:
What's the downstream destination? It would also help if you could elaborate on your use case:
- If the goal is landing data in a Lakehouse table or Warehouse, we have a new data ingestion offering coming up soon.
- If you specifically need streaming semantics (pub/sub, real-time transformations, Activator triggers), then Eventstream is the right tool, and we can work around the ingestion gap with a pre-generated SAS token, a lightweight proxy, or by flipping the direction using Fabric's HTTP source connector.

I'll take the connector enhancement back as a roadmap input.
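To make the pre-generated SAS token route concrete, here is a minimal sketch in Python (standard library only) that signs a SharedAccessSignature token for an Event Hubs-compatible endpoint and POSTs one JSON event to its REST ingestion path. The namespace, hub name, policy name, and key are placeholders: in practice you would copy them from the Eventstream custom endpoint's connection details, and the token could equally be generated once out-of-band and pasted into an integration platform's HTTP connection as the `Authorization` header.

```python
import base64
import hashlib
import hmac
import json
import time
import urllib.parse
import urllib.request


def generate_sas_token(resource_uri: str, key_name: str, key: str,
                       ttl_seconds: int = 3600) -> str:
    """Build a SharedAccessSignature token for an Event Hubs-compatible URI."""
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    # Sign "<url-encoded-uri>\n<expiry>" with the policy key (HMAC-SHA256).
    string_to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    )
    return (
        "SharedAccessSignature "
        f"sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}"
        f"&se={expiry}"
        f"&skn={key_name}"
    )


def post_event(namespace: str, hub: str, key_name: str, key: str,
               payload: dict) -> int:
    """POST one event to <hub>/messages; Event Hubs returns 201 on success."""
    uri = f"https://{namespace}.servicebus.windows.net/{hub}"
    req = urllib.request.Request(
        url=f"{uri}/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": generate_sas_token(uri, key_name, key),
            "Content-Type": "application/atom+xml;type=entry;charset=utf-8",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Where custom code is an option, the `azure-eventhub` SDK is the more robust alternative; the raw REST call above is mainly useful when all you can configure is an HTTP request with static headers.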
Thank you for the reply.
Yes, our goal is landing data in a Lakehouse table.
Currently, our workaround is to upload the data into blob storage as CSV files and run a notebook to load those CSV files into the lakehouse. This is the only way I could manage to get the final data into the lakehouse, but the approach is not efficient and I am looking for a way to improve it.
Thanks.
Thanks for the clarity. We have registered an enhancement request to cover these API endpoints.
Alternatively, the new data ingestion offering with Azure lakehouse support will be available soon. If you wish, we can schedule a demo for it.