Add and manage an event source in an eventstream

After you create a Microsoft Fabric eventstream, you can connect it to various data sources and destinations.

An eventstream isn't limited to streaming data from Microsoft sources. It also supports ingestion from third-party platforms like Google Cloud Pub/Sub and Amazon Kinesis through messaging connectors, so you can integrate external data streams into Fabric and gain real-time insights from multiple sources.

In this article, you learn about the event sources that you can add to an eventstream.

Prerequisites

  • Access to a workspace in the Fabric capacity license mode or trial license mode with Contributor or higher permissions.
  • Prerequisites specific to each source that are documented in the following source-specific articles.

Supported sources

Fabric eventstreams with enhanced capabilities support the following sources. Each article provides details and instructions for adding specific sources.

  • Azure Data Explorer (preview): If you have an Azure Data Explorer database and table, you can ingest data from the table into Microsoft Fabric by using eventstreams.
  • Azure Event Hubs: If you have an Azure event hub, you can ingest event hub data into Fabric by using eventstreams.
  • Azure Event Grid (preview): If you have an Azure Event Grid namespace, you can ingest MQTT or non-MQTT event data into Fabric by using eventstreams.
  • Azure Service Bus (preview): You can ingest data from an Azure Service Bus queue or a topic's subscription into Fabric by using eventstreams.
  • Azure IoT Hub: If you have an Azure IoT hub, you can ingest IoT data into Fabric by using eventstreams.
  • Custom endpoint (that is, custom app in standard capability): The custom endpoint feature allows your applications or Kafka clients to connect to an eventstream by using a connection string, enabling smooth ingestion of streaming data into eventstreams.
  • Azure IoT Operations: Configure Azure IoT Operations to send real-time data directly to Fabric Real-Time Intelligence by using an eventstream custom endpoint. This capability supports Microsoft Entra ID or SASL authentication.
  • Sample data: You can choose Bicycles, Yellow Taxi, Stock Market, Buses, S&P 500 company stocks, or Semantic Model Logs as a sample data source to test data ingestion while setting up an eventstream.
  • Real-time weather (preview): You can add a real-time weather source to an eventstream to stream real-time weather data from various locations.
  • Azure SQL Database Change Data Capture (CDC): You can use the Azure SQL Database CDC source connector to capture a snapshot of the current data in an Azure SQL database. The connector then monitors and records any future row-level changes to this data.
  • PostgreSQL Database CDC: You can use the PostgreSQL Database CDC source connector to capture a snapshot of the current data in a PostgreSQL database. The connector then monitors and records any future row-level changes to this data.
  • HTTP (preview): You can use the HTTP connector to stream data from external platforms into an eventstream by using standard HTTP requests. It also offers predefined public data feeds with autofilled headers and parameters, so you can start quickly without complex setup.
  • MongoDB CDC (preview): The MongoDB CDC source connector for Fabric eventstreams captures an initial snapshot of data from MongoDB. You can specify the collections to monitor, and the eventstream tracks and records real-time changes to documents in the selected databases and collections.
  • MySQL Database CDC: You can use the MySQL Database CDC source connector to capture a snapshot of the current data in an Azure Database for MySQL database. You can specify the tables to monitor, and the eventstream records any future row-level changes to those tables.
  • Azure Cosmos DB CDC: You can use the Azure Cosmos DB CDC source connector for Fabric eventstreams to capture a snapshot of the current data in an Azure Cosmos DB database. The connector then monitors and records any future row-level changes to this data.
  • SQL Server on Virtual Machine Database (VM DB) CDC: You can use the SQL Server on VM DB CDC source connector for Fabric eventstreams to capture a snapshot of the current data in a SQL Server database on a VM. The connector then monitors and records any future row-level changes to the data.
  • Azure SQL Managed Instance CDC: You can use the Azure SQL Managed Instance CDC source connector for Fabric eventstreams to capture a snapshot of the current data in a SQL Managed Instance database. The connector then monitors and records any future row-level changes to this data.
  • Fabric workspace item events: Fabric workspace item events are discrete Fabric events that occur when changes are made to your Fabric workspace, such as creating, updating, or deleting a Fabric item. With Fabric eventstreams, you can capture these workspace events, transform them, and route them to various destinations in Fabric for further analysis.
  • Fabric OneLake events: You can use OneLake events to subscribe to changes in files and folders in OneLake, and then react to those changes in real time. With Fabric eventstreams, you can capture these OneLake events, transform them, and route them to various destinations in Fabric for further analysis. This seamless integration of OneLake events within Fabric eventstreams gives you greater flexibility for monitoring and analyzing activities in OneLake.
  • Fabric job events: You can use job events to subscribe to changes produced when Fabric runs a job; for example, when refreshing a semantic model, running a scheduled pipeline, or running a notebook. Each of these activities can generate a corresponding job, which in turn generates a set of corresponding job events. With Fabric eventstreams, you can capture these job events, transform them, and route them to various destinations in Fabric for further analysis. This seamless integration of job events within Fabric eventstreams gives you greater flexibility for monitoring and analyzing your job activities.
  • Fabric capacity overview events (preview): Fabric capacity overview events provide summary-level information about your capacity. You can use these events to create alerts related to your capacity health via Fabric Activator. You can also store these events in an eventhouse for granular or historical analysis.
  • Azure Blob Storage events: Azure Blob Storage events are triggered when a client creates, replaces, or deletes a blob. You can use the connector to link Blob Storage events to Fabric events in a real-time hub, convert these events into continuous data streams, and transform them before routing them to various destinations in Fabric.
  • Google Cloud Pub/Sub: Google Pub/Sub is a messaging service that enables you to publish and subscribe to streams of events. You can add Google Pub/Sub as a source to your eventstream to capture, transform, and route real-time events to various destinations in Fabric.
  • Amazon Kinesis Data Streams: Amazon Kinesis Data Streams is a massively scalable, highly durable data ingestion and processing service that's optimized for streaming data. By integrating Amazon Kinesis Data Streams as a source within your eventstream, you can seamlessly process real-time data streams before routing them to multiple destinations within Fabric.
  • Confluent Cloud for Apache Kafka: Confluent Cloud for Apache Kafka is a streaming platform that provides powerful data streaming and processing capabilities by using Apache Kafka. By integrating Confluent Cloud for Apache Kafka as a source within your eventstream, you can seamlessly process real-time data streams before routing them to multiple destinations within Fabric.
  • Apache Kafka (preview): Apache Kafka is an open-source, distributed platform for building scalable, real-time data systems. By integrating Apache Kafka as a source within your eventstream, you can seamlessly bring real-time events from Apache Kafka and process them before routing them to multiple destinations within Fabric.
  • Amazon MSK Kafka: Amazon MSK Kafka is a fully managed Kafka service that simplifies setup, scaling, and management. By integrating Amazon MSK Kafka as a source within your eventstream, you can seamlessly bring real-time events from MSK Kafka and process them before routing them to multiple destinations within Fabric.
  • MQTT (preview): You can use Fabric eventstreams to connect to an MQTT broker. Messages in an MQTT broker can be ingested into Fabric eventstreams and routed to various destinations within Fabric.
  • Cribl (preview): You can connect Cribl to an eventstream and route data to various destinations within Fabric.
  • Solace PubSub+ (preview): You can use Fabric eventstreams to connect to Solace PubSub+. Messages from Solace PubSub+ can be ingested into Fabric eventstreams and routed to various destinations within Fabric.
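As a concrete illustration of the custom endpoint source, the following sketch builds a Kafka client configuration from a connection string. The endpoint address and connection string are placeholders; in practice you copy the real values from the custom endpoint's details pane in the eventstream editor. This sketch assumes the endpoint follows the Azure Event Hubs Kafka convention (SASL_SSL with the PLAIN mechanism and the literal username `$ConnectionString`):

```python
# Sketch: Kafka client configuration for an eventstream custom endpoint.
# All values below are hypothetical placeholders, not real endpoints.

def kafka_config(bootstrap_server: str, connection_string: str) -> dict:
    """Return a config dict in the librdkafka/confluent-kafka key style."""
    return {
        "bootstrap.servers": bootstrap_server,   # host:9093 from the endpoint details
        "security.protocol": "SASL_SSL",         # TLS plus SASL authentication
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",    # literal string, per the Event Hubs Kafka convention
        "sasl.password": connection_string,      # the full connection string acts as the password
    }

config = kafka_config(
    "myeventstream.servicebus.windows.net:9093",                  # hypothetical endpoint
    "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...",  # placeholder, keep elided
)
print(config["security.protocol"], config["sasl.mechanism"])
```

A Kafka producer created with this configuration can then publish events to the topic name shown in the endpoint details, and the eventstream ingests them like any other source.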
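Similarly, the HTTP source accepts events pushed as standard HTTP requests. The sketch below shapes a JSON event as a POST request using only the Python standard library; the URL and payload fields are hypothetical placeholders, and the real endpoint URL and any required headers come from the HTTP source's details pane. The request is only constructed here, not sent:

```python
import json
import urllib.request

# Sketch: shaping an HTTP request that pushes one JSON event to an
# eventstream HTTP source. URL and payload are placeholders.

event = {"deviceId": "sensor-01", "temperature": 21.7}  # example payload

req = urllib.request.Request(
    url="https://example.fabric.microsoft.com/ingest",  # hypothetical endpoint URL
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(req)  # uncomment to actually send the event
print(req.get_method(), req.get_header("Content-type"))
```

Any tool that can issue HTTP requests (curl, an Azure Function, a cron job) can feed the same endpoint, which is what makes this source useful for platforms without a native connector.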

Note

An eventstream supports a combined total of up to 11 sources and destinations, counting only the following types:

  • Source: Custom endpoint.
  • Destinations: Custom endpoint and eventhouse with direct ingestion.

Any sources or destinations not included in the preceding list, and destinations not appended to the default stream, don't count toward this limit.