Categories: Fabric

Execute Fabric Data Pipeline from Azure Data Factory

In the blog post Call a Fabric REST API from Azure Data Factory I explained how you can call a Fabric REST API endpoint from Azure Data Factory (or Synapse if you will). Let’s go a step further and execute a Fabric Data Pipeline from an ADF pipeline, which is a common request. A Fabric capacity cannot auto-resume, so you typically have an ADF pipeline that starts the Fabric capacity. Once the capacity is running, you want to kick off your ETL pipelines in Fabric, and now you can do this from ADF as well.

Prerequisites

You obviously need an ADF instance and a Fabric-enabled workspace in the Fabric/Power BI service. In Fabric, I created a very simple pipeline with a Wait activity:

To authenticate with the Fabric REST API, we need either a service principal or the managed identity of the ADF instance (the latter is preferred, as it doesn’t have a secret that can expire). You then need to create a security group in Microsoft Entra ID and add the SP or the managed identity to that group. In the Fabric Admin portal, you need to enable the tenant setting that allows service principals to use Fabric APIs, and scope it to the security group.

Once that’s done, you can add the SP or the managed identity to your workspace as a contributor.

Trigger the Pipeline from ADF

The endpoint that we’re going to use is Run On Demand Item Job. For a Data Pipeline, it has the following URL:

https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{itemId}/jobs/instances?jobType=Pipeline

We need two parameters:

  • the workspace ID
  • the ID of the Data Pipeline in Fabric (unfortunately it’s not possible to trigger a pipeline by name)

Both can be found in the URL when you have the pipeline open in the browser:

I added both as parameters of my ADF pipeline:

To call the Fabric REST API we will use a Web activity. For the URL I used the following expression:

@concat('https://api.fabric.microsoft.com/v1/workspaces/',pipeline().parameters.workspaceID,'/items/',pipeline().parameters.pipelineID,'/jobs/instances?jobType=Pipeline')
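Outside ADF, the same URL can be assembled in a few lines of Python; this sketch mirrors the @concat expression above (the GUIDs are placeholders, not real resources):

```python
# Build the Run On Demand job URL exactly as the ADF @concat expression does.
# workspace_id and pipeline_id are placeholder GUIDs, not real resources.
workspace_id = "00000000-0000-0000-0000-000000000001"
pipeline_id = "00000000-0000-0000-0000-000000000002"

url = (
    "https://api.fabric.microsoft.com/v1/workspaces/"
    f"{workspace_id}/items/{pipeline_id}/jobs/instances?jobType=Pipeline"
)
print(url)
```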

The method is set to POST, and I used the following body:

{"executionData":{}}

However, if you need to pass along parameter values to the Fabric pipeline, you can do that in the JSON body. You can find an example here.
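The exact schema depends on the parameters your Fabric pipeline defines; as a sketch, a body passing two hypothetical parameters (myParam, loadDate) could look like this, built here in Python for clarity:

```python
import json

# Hedged sketch of a request body that passes parameter values to the
# Fabric pipeline. The parameter names (myParam, loadDate) are made up;
# use the names your own pipeline actually defines.
body = {
    "executionData": {
        "parameters": {
            "myParam": "some value",
            "loadDate": "2024-01-01",
        }
    }
}

print(json.dumps(body))
```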

If you use service principal authentication, you’ll need the app ID of the SP, and the secret (which you stored in Azure Key Vault of course):

If you use the managed identity of ADF, the configuration is a bit easier:

Once the config is done, you can debug the ADF pipeline. The default behavior is to call the REST API synchronously, which means the Web activity will run for the entire duration the Fabric pipeline is running.

You can verify the Fabric pipeline was triggered in the Fabric monitoring view:

If you don’t want a synchronous execution (a running activity in ADF costs money), you can select the Disable async pattern option in the advanced settings (which is a really confusing name).

The pipeline will then finish almost instantly. However, it will always succeed, even if the Fabric Data Pipeline fails.
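If you do need the outcome, one option is to poll the job instance afterwards (the response of the POST includes a Location header pointing at the job instance) and inspect its status field. A hedged Python sketch of that status check; the state names are an assumption based on the Fabric job scheduler, so verify them against the API docs:

```python
# Decide whether a polled Fabric job instance is finished and whether it
# succeeded, based on its "status" field. The state names here are an
# assumption (Completed/Failed/Cancelled/Deduped/InProgress/NotStarted);
# check the Fabric job scheduler documentation for the authoritative list.
TERMINAL_STATES = {"Completed", "Failed", "Cancelled", "Deduped"}

def job_outcome(instance: dict) -> tuple[bool, bool]:
    """Return (is_finished, is_success) for a job instance payload."""
    status = instance.get("status", "")
    return status in TERMINAL_STATES, status == "Completed"

print(job_outcome({"status": "Completed"}))   # (True, True)
print(job_outcome({"status": "InProgress"}))  # (False, False)
```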

I’m quite happy with this new functionality as it makes working with Fabric a bit easier and more mature.


------------------------------------------------
Do you like this blog post? You can thank me by buying me a beer 🙂
Koen Verbeeck

Koen Verbeeck is a Microsoft Business Intelligence consultant at AE, helping clients to get insight in their data. Koen has a comprehensive knowledge of the SQL Server BI stack, with a particular love for Integration Services. He's also a speaker at various conferences.

Comments

  • Hello,

    I can’t find the Fabric admin panel; I have the Power BI one, but not Fabric. Do you have any idea why? I don’t see the “Service principals can use Fabric APIs” setting.

    Thanks in advance

    • Hi Jean-Philippe,

      Are you a Fabric Administrator? (Global admin, not admin of a workspace.)

      Koen
