Introduction

To work with data in Microsoft Fabric, you first need to get data into it. There are several ways to do this; one of them is a dataflow. In this tutorial, we explain step by step how to ingest data into a Fabric lakehouse using a dataflow.

Goal

A CSV file should be ingested into a delta table in the lakehouse.

Source

The source data is a CSV file with the name student.csv stored under "Files" in the lakehouse.
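The post does not show the file's contents, so here is a hypothetical sketch of what student.csv might look like and how to inspect such a file before building the dataflow, using Python's standard csv module (the column names and values below are illustrative assumptions, not taken from the tutorial):

```python
import csv
import io

# Hypothetical sample of student.csv -- the real file's columns may differ.
sample = """student_id,name,grade
1,Alice,A
2,Bob,B
3,Carol,A
"""

# Parse the CSV into a list of dictionaries, one per data row.
rows = list(csv.DictReader(io.StringIO(sample)))

print(rows[0]["name"])  # -> Alice
print(len(rows))        # -> 3
```

Inspecting the header and a few rows like this makes it easier to check later that the dataflow inferred the expected columns and data types.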

Target

The target is a delta table in the lakehouse.

Step 1: Open Data Factory Experience

We have already signed in to Fabric, opened the Data Factory Experience, and selected the workspace "dlnerds_workspace".

Here you can see the Data Factory Experience.

Step 2: Create Dataflow

We have already created the dataflow "dlnerds_ingest_student". Check out the following post if you want to know how to create a dataflow.

How to create a Dataflow in Microsoft Fabric: A Step-by-Step Guide

Open the dataflow. This takes you to the Dataflow Gen2 interface.

Step 3: Get Data

First, choose a data source. To do this, click on "Get data".

A new window opens. Navigate to the "New" tab.

Since the source data is stored in a lakehouse, select the "Microsoft Fabric" tab and choose "Lakehouse".
