
(Real) Time After Time: Getting started with Real Time Intelligence in Microsoft Fabric

I’ve recently been studying to take the new DP-700 Fabric Data Engineering Associate exam. In a recent blog I looked at Kusto Query Language (KQL), but in terms of Fabric and DP-700 you can’t talk about KQL without talking about Real-Time Intelligence.

Real-Time Intelligence is a suite of tools in Microsoft Fabric which collectively handle real-time data ingestion and analytics. Real-time data can be described as any continuous, time-based stream of data. This could include weather data, financial data, transaction data, log and telemetry data, or anything else which produces continuous timed events.

Real-Time Intelligence is made up of five main parts:

  • EventStream - Ingest and process real-time streams of data
  • Eventhouse - Workspace of KQL databases to store event data
  • Activator - Trigger automated actions when certain conditions are met
  • KQL Queryset - Query data stored in a KQL database using KQL
  • Real-Time Dashboard - Integrate a dashboard with a KQL database to visualise real-time data

Create an Eventhouse

First you need a Fabric workspace. You can then create an Eventhouse by creating a new item in your workspace and selecting Eventhouse. You will need to choose a name and click Create.

Once you have created an Eventhouse, a KQL database with the same name will be automatically created.

You can create objects in your KQL database by clicking New. You can also click Get Data to load the database with data (one-time load).
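Alongside the UI, you can also create objects with KQL management commands in the database query pane. A minimal sketch, where the table name, columns, and sample row are my own placeholders rather than anything from this walkthrough:

```kql
// Create a table with a hypothetical schema for incoming events
.create table StockEvents (Timestamp: datetime, Symbol: string, BidPrice: real)

// One-off inline ingestion of a sample row, useful for quick testing
.ingest inline into table StockEvents <|
2024-01-01T00:00:00Z,CONTOSO,101.5
```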

When you click Get Data you need to select a source. I will select Azure Storage.

Once you select a source you will need to add the URI and any other configuration such as folder path and file extension. You can also select the destination table, either an existing table or a new one.

You can then inspect the data and click Finish if you are happy.

Create a KQL Queryset

Create a new item in your workspace and select KQL Queryset.

You will then have an IDE-like view with which to create KQL queries. You can create a query or series of queries and run them on your data to see the results. When you are happy you can save the queryset.
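As a rough sketch, a query in a queryset might look like this (the table name StockEvents and its Timestamp column are my own placeholders):

```kql
// Return the 10 most recent events from the table
StockEvents
| sort by Timestamp desc
| take 10
```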

If you want more information about KQL and how to write KQL queries check out my previous blog here.

Create an EventStream

Create a new item in the workspace and select the EventStream item type.

You can then select your data source. I will select the Stock Market sample data.

You should then be able to see a visual representation of the EventStream. You can add Transform events activities to the EventStream. For example, you can add a Manage Fields activity. This lets you select fields, drop fields, and modify names and data types.

To change a data type you need to add a field to the Manage Fields activity and set the Change Type slider to Yes. You can then select a converted type.

Manage Fields Change Type

The other available transform events are:

  • Filter - Keep events based on a field value or condition and discard the rest
  • Aggregate - Calculate an aggregation (like sum or average) each time a new event occurs
  • Join - Combine events from two streams based on a matching key or time
  • Group By - Calculate aggregations over fixed or overlapping time windows, grouped by a field or set of fields
  • Union - Combine events from streams that share fields
  • Expand - Split an array into multiple rows, one for each array value

Once you have performed any transforms you must then select a destination for your EventStream. This may be a new or existing table in an Eventhouse or Lakehouse. You can also use a custom endpoint, a derived stream, or an Activator as a destination.

Create a Real-Time Dashboard

Create a new workspace item and choose the Real-Time Dashboard item type.

Create a tile. You will then need to select a KQL database and write a KQL query for the tile to be based on. I have written a query to return the average Bid Price grouped by sector.

 Real Time Dashboard Query-1
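That tile query might look something like the following; the table name is a placeholder, and the column names (bidPrice, sector) are assumed from the Stock Market sample data:

```kql
// Average bid price per sector, for the dashboard tile
StockMarket
| summarize AvgBidPrice = avg(bidPrice) by sector
| sort by AvgBidPrice desc
```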

Once you have a query you can create a visual based on it, and this will be displayed in your dashboard. There are many visual types available, such as bar, column, line, scatter, and pie.

Create an Activator

Create a new item in your workspace and choose the Activator item type. Once you have created an activator you need to select Get Data.

You can select from various sources, with your existing EventStreams listed at the bottom. Other possible sources include third-party event streams such as Kafka or Kinesis, Blob Storage/OneLake events, and Fabric workspace events. I will select my existing EventStream.

Once you have events streaming to the Activator (it can take a few moments for things to sync) you can create rules that send alerts via email or Teams. Here I have created a rule on the column bidPrice: if the value is greater than 2000 a message will be sent. Once you have created a rule, the preview should show how many events in the last hour would have triggered an alert. This is useful so you don’t accidentally send an alert for every event and end up sending 5000 emails every hour!

Activator Condition

That is a quick tour of Real-Time Intelligence. I hope I have shown how easy it is to get started with RTI, but there is a lot of depth, so I encourage you to have a go. Just playing around with the sample data in a Fabric trial workspace is a great way to learn.

If, like me, you are interested in RTI because of the DP-700 exam, the most important aspects to remember are that RTI is for ingestion and analytics of real-time streaming events. You need to know that EventStream handles the ingestion part, and be aware of the transformation options within EventStream.

If you want to learn more about implementing Real-Time Intelligence solutions, there is a learning path on Microsoft Learn which I would recommend studying before DP-700:

🔗 Implement Real Time Intelligence In Microsoft Fabric

If you need help with Real Time Intelligence or anything Fabric related at Advancing Analytics we are here for you!

Contact us here or find out more about our Fabric offerings here.

 


Author

Ed Oldham