This tutorial shows how to create your first ontology (preview) in Microsoft Fabric, either by generating it from an existing Power BI semantic model or by building it from your OneLake data. Then, enrich the ontology with live operational data and explore it with both graph preview and natural-language (NL) queries with a Fabric data agent.
Important
This feature is in preview.
The example scenario for this tutorial is a fictional company called Lakeshore Retail. Lakeshore is a retail ice cream seller that keeps sales data along with streaming data from its freezers. In the tutorial, you generate entity types (like Store, Products, and SaleEvent), bind streaming data (like freezer temperature) from Eventhouse, and answer questions like: "Which stores have fewer ice cream sales when their freezer temperature rises above -18°C?"
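To make the example question concrete, here's a toy illustration of the logic the Fabric data agent would apply. This is not the data agent itself; the store names, temperature readings, and sales figures are made up for the sketch.

```python
from collections import defaultdict

# Hypothetical (store, freezer_temp_c, units_sold) readings.
sales = [
    ("Store A", -20.0, 120),
    ("Store A", -15.5, 70),
    ("Store B", -19.0, 90),
    ("Store B", -22.0, 95),
]

def stores_hurt_by_warm_freezers(rows):
    """Stores whose average sales drop while the freezer is above -18 °C."""
    warm, cold = defaultdict(list), defaultdict(list)
    for store, temp, units in rows:
        (warm if temp > -18.0 else cold)[store].append(units)
    avg = lambda xs: sum(xs) / len(xs)
    return sorted(
        s for s in warm
        if s in cold and avg(warm[s]) < avg(cold[s])
    )

print(stores_hurt_by_warm_freezers(sales))  # ['Store A']
```

In the tutorial, the same join between sales facts and freezer telemetry happens across the lakehouse and Eventhouse, mediated by the ontology rather than in-memory Python.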
Choose scenario for creating ontology
This tutorial contains two options for setting up the ontology (preview) item: automatically generate it from a semantic model, or manually build it from OneLake data.
Choose your preferred scenario by using the selector at the beginning of the article.
Prerequisites
- A workspace with a Microsoft Fabric-enabled capacity. Use this workspace for all resources created in the tutorial.
- Ontology item (preview), Graph (preview), XMLA endpoints, and Data agent item types (preview) enabled on your tenant.
Fabric administrators can grant access to ontology in the admin portal. In the tenant settings, enable Ontology item (preview), Graph (preview), XMLA endpoints, and Data agent item types (preview).
For more information about these prerequisites, see Ontology (preview) required tenant settings.
Download sample data
Download the contents of this GitHub folder: IQ samples.
The folder contains the following sample CSV files. The data includes static entity details about the Lakeshore Retail scenario and streaming data from its freezers.
- DimStore.csv
- DimProducts.csv
- FactSales.csv
- Freezer.csv
- FreezerTelemetry.csv
Prepare the lakehouse
First, create a new lakehouse called OntologyDataLH in your Fabric workspace. When creating it, make sure the Lakehouse schemas (Public Preview) checkbox is cleared.
Then, upload the following four sample CSV files to your lakehouse, and load each one into a new Delta table. These files contain entity details about business objects in the Lakeshore Retail scenario.
- DimStore.csv
- DimProducts.csv
- FactSales.csv
- Freezer.csv
- (NOT FreezerTelemetry.csv. This file is uploaded to Eventhouse in a later step.)
For detailed instructions on loading files to lakehouse tables, see the first three sections of CSV file upload to Delta table for Power BI reporting.
The lakehouse looks like this when you're done:
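If you prefer a notebook, the same load can be sketched in a Fabric notebook attached to the lakehouse. The `Files/` paths and the Spark calls in the comments are assumptions for illustration; the runnable part only derives the table names, which Load to Tables produces in lowercase without the extension.

```python
# The four files loaded to Delta tables in this section.
csv_files = ["DimStore.csv", "DimProducts.csv", "FactSales.csv", "Freezer.csv"]

def table_name(filename: str) -> str:
    # Load to Tables creates lowercase table names without the ".csv" suffix.
    return filename.removesuffix(".csv").lower()

# In a notebook attached to OntologyDataLH, the equivalent is roughly
# (hypothetical paths, not run here):
# for f in csv_files:
#     df = spark.read.option("header", True).csv(f"Files/{f}")
#     df.write.format("delta").mode("overwrite").saveAsTable(table_name(f))

print([table_name(f) for f in csv_files])
# ['dimstore', 'dimproducts', 'factsales', 'freezer']
```

These lowercase table names are the ones you select later when creating the semantic model.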
Prepare the Power BI semantic model
This section prepares you to generate an ontology from a semantic model. If you're not following the semantic model scenario and you want to build the ontology manually from OneLake, use the selector at the beginning of the article to change to the OneLake scenario.
From the lakehouse ribbon, select New semantic model.
In the New semantic model pane, set the following details.
- Direct Lake semantic model name: RetailSalesModel
- Workspace: Your tutorial workspace is chosen by default.
- Select or deselect tables for the semantic model. Select three tables:
- dimproducts
- dimstore
- factsales
- (NOT freezer. This entity is created manually in a later step.)
Select Confirm.
Open the semantic model in Editing mode when it's ready. From the ribbon, select Manage relationships.
In the Manage relationships pane, use the + New relationship button to create two relationships with the following details.
| From table | To table | Cardinality | Cross-filter direction | Make this relationship active? |
| --- | --- | --- | --- | --- |
| factsales, select StoreId | dimstore, select StoreId | Many to one (*:1) | Single | Yes |
| factsales, select ProductId | dimproducts, select ProductId | Many to one (*:1) | Single | Yes |

The relationships look like this when you're done:
Select Close.
Now the semantic model is ready to import into ontology.
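A many-to-one (*:1) relationship means every fact row's key must resolve to exactly one dimension row. The sketch below checks that property on a few hypothetical key values; the IDs are invented for illustration, not taken from the sample data.

```python
# Hypothetical dimension keys (unique by construction as sets).
dimstore_ids = {"S1", "S2"}
dimproducts_ids = {"P1", "P2", "P3"}

# Hypothetical fact rows referencing the dimensions.
factsales = [
    {"StoreId": "S1", "ProductId": "P2"},
    {"StoreId": "S2", "ProductId": "P3"},
    {"StoreId": "S1", "ProductId": "P1"},
]

def many_to_one_ok(fact_rows, key, dim_ids):
    # Many-to-one holds when every fact key exists among the unique dim keys;
    # many fact rows may share one dimension row, never the reverse.
    return all(row[key] in dim_ids for row in fact_rows)

print(many_to_one_ok(factsales, "StoreId", dimstore_ids))     # True
print(many_to_one_ok(factsales, "ProductId", dimproducts_ids))  # True
```

If either check failed, Power BI would report the relationship as invalid or leave some fact rows unmatched in visuals.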
Prepare the eventhouse
Finally, follow these steps to upload the device streaming data file to a KQL database in Eventhouse.
- Create a new eventhouse called TelemetryDataEH in your Fabric workspace. A default KQL database is created with the same name. For detailed instructions, see Create an eventhouse.
- The eventhouse opens when it's ready. Open the KQL database by selecting its name.
- Create a new table called FreezerTelemetry that uses the FreezerTelemetry.csv sample file as a source. For detailed instructions, see Get data from file.
The KQL database shows the FreezerTelemetry table when you're done:
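As a quick sanity check on the new table, you might run a KQL query like `FreezerTelemetry | summarize count() by FreezerId` in the KQL database. The Python below simulates that aggregation on a few hypothetical rows; the `FreezerId` and `Temperature` column names are assumptions, since the sample file's schema isn't shown here.

```python
from collections import Counter

# Hypothetical telemetry rows standing in for the FreezerTelemetry table.
telemetry = [
    {"FreezerId": "F1", "Temperature": -19.5},
    {"FreezerId": "F1", "Temperature": -17.2},
    {"FreezerId": "F2", "Temperature": -21.0},
]

# Equivalent of KQL: FreezerTelemetry | summarize count() by FreezerId
counts = Counter(row["FreezerId"] for row in telemetry)
print(dict(counts))  # {'F1': 2, 'F2': 1}
```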
Next steps
Now your sample scenario is set up in Fabric. Next, create an ontology (preview) item, either by generating it automatically from a semantic model or building it manually from the OneLake data source.
Both scenario options are available in the next step, Create an ontology.