Monday, April 23, 2018

Hybrid integrations with Azure components - large file to database on-premises import - Part 2 - create Azure AD application and install on-premises data gateway

Before we can trigger the pipeline we created in part 1, we need an Azure AD application that can be authorized to call the Azure Service Management API, which contains the methods for creating a pipeline run and checking pipeline run status. We need this access to the Azure Service Management API in order to trigger our pipeline from a Logic App, which is our ultimate goal.
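To make this concrete, below are the two Service Management (ARM) REST calls we'll ultimately rely on: one that creates a pipeline run and one that checks a run's status. The sketch is only an illustration in Python; all names are placeholders, the api-version may differ depending on when you read this, and the bearer token is something we can obtain once the AD app is in place (see the sketch further down).

```python
import requests

# All values below are placeholders - substitute your own (we collect them later in this post)
subscription_id = "<subscription-id>"
resource_group  = "<resource-group-name>"
factory_name    = "<data-factory-name>"
pipeline_name   = "<pipeline-name>"
token           = "<bearer-token-for-the-service-management-api>"

base = (f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
        f"/factories/{factory_name}")
headers = {"Authorization": f"Bearer {token}"}

# Create (trigger) a pipeline run
run = requests.post(f"{base}/pipelines/{pipeline_name}/createRun",
                    params={"api-version": "2018-06-01"},
                    headers=headers).json()
run_id = run["runId"]

# Check the status of that pipeline run
status = requests.get(f"{base}/pipelineruns/{run_id}",
                      params={"api-version": "2018-06-01"},
                      headers=headers).json()
print(status["status"])  # e.g. InProgress, Succeeded, Failed
```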

(As I wrote in part 1, this might not be necessary for future implementations, as I've read that the Logic Apps dev team has an ADF connector in the pipeline.)

But right now it's needed, so let's get started. Instead of going through all the detailed steps on how to create an Azure AD application, I'll point you to the guidelines in the Microsoft tutorial below; if for some reason the link is removed or becomes obsolete, you should be able to find a similar tutorial by googling 'Azure AD application create tutorial'.

Follow the steps in the tutorial below:

How to create an Azure AD application

You'll need to give your Azure AD application access to the Azure Service Management API. You do this by setting required permissions on your AD app. Click 'Azure Active Directory' -> 'App registrations' and then the app you just created (if it doesn't appear, select 'All apps' in the drop-down list).


Then click 'Settings' -> 'Required permissions' and the '+ Add' button. Search for 'Windows Azure Service Management API' and make sure that you select the delegated permission 'Access Azure Service Management as organization users (preview)'. Save your app changes.


Finally, in order for our AD app to have access to the data factory pipelines, we'll add it as a Contributor on the Azure subscription. (It should presumably be possible to grant this access in a more granular way, but for simplicity we'll give the AD app this high-level permission.)

In the search field at the top of the Azure portal, type 'Subscriptions' and wait for the drop-down result list to appear. Click 'Subscriptions' and then click the subscription you're working in (in my case I only have one subscription).


Click 'Access control (IAM)' -> the '+ Add' button and choose the role 'Contributor'. Search for your AD app name in the 'Select' field and finally click 'Save'.
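If you'd rather script this step than click through the portal, the sketch below shows roughly what the portal does behind the scenes via the Authorization REST API. It's only an illustration with a few assumptions: you need a bearer token for an account that already has Owner (or User Access Administrator) rights on the subscription, the principal ID is the object ID of the AD app's service principal (not the application ID), and the GUID in the role definition path is the well-known built-in 'Contributor' role.

```python
import uuid
import requests

# Placeholder values - assumptions for this sketch
subscription_id     = "<subscription-id>"
principal_object_id = "<object-id-of-the-ad-app's-service-principal>"
admin_token         = "<bearer-token-for-an-account-with-owner-rights>"

# b24988ac-... is the well-known ID of the built-in 'Contributor' role
role_definition_id = (f"/subscriptions/{subscription_id}/providers/"
                      "Microsoft.Authorization/roleDefinitions/"
                      "b24988ac-6180-42a0-ab88-20f7382dd24c")

scope = f"/subscriptions/{subscription_id}"   # subscription-level assignment
assignment_name = str(uuid.uuid4())           # a new GUID names the role assignment

resp = requests.put(
    f"https://management.azure.com{scope}/providers/Microsoft.Authorization"
    f"/roleAssignments/{assignment_name}",
    params={"api-version": "2015-07-01"},
    headers={"Authorization": f"Bearer {admin_token}"},
    json={"properties": {"roleDefinitionId": role_definition_id,
                         "principalId": principal_object_id}})
resp.raise_for_status()
```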


Our AD app is now finalized and has enough permissions for us to trigger and monitor our pipeline from a Logic App.

However, you'll need some values for part 3 of this tutorial (where we create the Logic App that'll trigger our pipeline), so I recommend that you collect these values right now, maybe in some notepad type of application. (They're also the values used in the sketch right after the list.)

  • Tenant ID (called Directory ID in the portal; I found this value by clicking the 'Azure Active Directory' tab -> 'Overview', where it's listed at the top of the overview blade)
  • Client ID (in the portal it's called Application ID; you'll find it via 'Azure Active Directory' -> 'App registrations' -> the app you just created (if it doesn't appear, select 'All apps' in the drop-down list))
  • Client secret (this one was generated when you created your AD app, and you should have stored a copy)
  • Subscription ID (in the search field at the top of the Azure portal, type 'Subscriptions' and wait for the drop-down result list to appear; click 'Subscriptions' and then click the subscription you're working in - in my case I only have one subscription)
  • Resource group name (where your data factory resides)
  • Azure data factory name
  • Azure data factory pipeline name
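As a quick illustration of how these values fit together, here's a minimal Python sketch (not something you need to run; the Logic App in part 3 will do the equivalent with HTTP actions) that exchanges the tenant ID, client ID and client secret for a bearer token against the Azure Service Management API using the standard Azure AD client-credentials flow. The token can then be used in the createRun and pipeline-run-status calls sketched earlier; all values are placeholders.

```python
import requests

# Placeholder values - the ones collected in the list above
tenant_id     = "<tenant-id>"
client_id     = "<client-id (application id)>"
client_secret = "<client-secret>"

# Azure AD OAuth2 client-credentials flow: exchange the AD app's credentials
# for a bearer token scoped to the Azure Service Management API
token_response = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://management.azure.com/",
    }).json()

token = token_response["access_token"]
# This token goes into the 'Authorization: Bearer <token>' header of the
# createRun / pipelineruns calls shown earlier in this post.
print(token[:20] + "...")
```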
Finally, in order to access files on our on-premises file area from a Logic App, we need to install the Microsoft on-premises data gateway.

I just googled 'azure on premises data gateway download' and the download link was the first result, so today you can get it here:


Installation is quite simple; just follow the installer.

1 - Installation process - set configuration information.


2 - During the configuration phase of the installation you'll need to log in to Azure, and unless you use a corporate account or a student account it won't allow you to. But there's a workaround, as I learned during the Global Integration Bootcamp: you can create a new user in your Azure AD that belongs to your Azure AD tenant. So do this, and assign that user the Contributor role in your Azure subscription.

3 - Create a gateway artifact in a resource group of your choice in your Azure subscription. Be aware that if you don't have a corporate or student account (see point 2), you'll need to log in to your Azure subscription with the Azure AD user created in step 2.


Great, now we have all the artifacts we need to tie everything together in part 3, where we'll create a Logic App that uses the on-premises data gateway and the Azure AD application we just created to trigger and monitor our pipeline. (part 1)
