Import Windows Autopilot devices with Logic Apps

Windows Autopilot lifecycle automation

In one of my previous posts, I shared how I created some Windows Autopilot lifecycle automation with Azure Logic Apps. With that Logic Apps flow, it’s possible to (re-)assign a deployment profile or remove a device from the Autopilot Service. But with that flow, it’s not possible to import our csv files with device serials and hashes. In this post, I describe how we can create some automation with Logic Apps to automatically upload the csv file with Autopilot information.

The solution in short

The Logic App automatically checks a SharePoint Online folder for new files every x minutes. The flow picks up the csv file, removes the first line of the csv, splits the content into separate lines of data and splits each line by comma. The data is then parsed, so we can use it with the Graph API.

The flow first checks if the serial number is an existing serial or a new serial. If it’s a new serial number, the serial number is imported to the Windows Autopilot service by using Microsoft Graph. Because it takes some time to finish importing the serial, I added a loop in the flow. With this, we’re able to get the final import status of the import job, so we can handle that in the flow. Based on the final import status, an email is sent from a shared mailbox to the person who placed the csv file in the folder.

That looks like this in an overview.

Requirements

First of all, we need a folder on a SharePoint site to which the users to whom you delegate the task of importing the device info have access. The (service) account under which the Logic App flow runs also needs access to that folder.

Via HTTP actions, we’re able to use Microsoft Graph to import the device information to the Autopilot service. These HTTP actions use an Azure App Registration, which is the second requirement. The App Registration is used for authentication in the HTTP actions and also for granting the required permissions.
The minimum required permissions for this Logic App are shown below:
DeviceManagementServiceConfig.ReadWrite.All

To secure the HTTP actions I use an Azure Key Vault, which is described in this previous post.

I used a service account to access the SharePoint folder and to send notifications using a shared mailbox.

Besides that, I used this article from Paul Murana to get all the steps in place to process the csv data in the flow.

Setup the first part of the Logic App

In the first part of the Logic App flow we setup the trigger, get file metadata and properties and start processing the csv file.

Sign-in to the Azure portal and open the Logic Apps service. Here create a new, blank Logic App.

In the search box for connections and triggers search for SharePoint. Select SharePoint, scroll down (or search again) and select When a file is created in a folder.

Click Sign in and authenticate with your (service) account which has access to the folder.

Select the Site address from the drop-down. If the address is not shown, click Enter custom value and enter the address.
Click on the folder icon and select the folder.
Make your choice of how often the Logic App needs to check the folder for new items.

Add the Get file metadata (SharePoint) action to the flow.
Select the site address again.
Under Dynamic content search for File identifier, which is content from the SharePoint trigger. Add this File identifier in the File identifier box.

Add a Get file properties action.
Select the Site address.
Search for ItemId, which is from the Get file metadata action.
Data from this Get file properties action is later used in the flow, to grab the email address of the person who added the csv file to the folder.

The next action is a Data Operations action. Search for the Compose action and add it to the flow.

Rename the action to CSV.
As Input, we use File content from the SharePoint trigger.

As the flow handles the csv file as one big line of data, we need to split the lines.
Add a Compose action again and rename it to New Line.
Just hit the ENTER key to add an empty line to the input box.

With this action, we split the data by line.
Enter a Compose action and rename it to SplitNewLine.
In the Dynamic content popup, enter split()

Next, make sure the cursor is between the brackets and hit Outputs (from CSV). Add a comma directly after the inserted outputs('CSV'), hit Outputs from New Line and click OK.
Or just write the expression manually.

The expression should now look like this.

As the first line of the csv file contains data we don’t process, we remove it with this next Compose action.

Add a Compose action.
Type skip().
Between the brackets hit Outputs (from SplitNewLine). Type ,1 before the last bracket and click OK.

Rename the action to CSVData.

Our first part of the flow is created!
We imported the data, split the data by line and removed the first line of the csv file.
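In plain code, these first actions amount to splitting the file content into lines and skipping the header. Here is a minimal Python sketch of that logic; the sample content and variable names are illustrative, not the actual Logic App action names:

```python
# Sketch of what the CSV, SplitNewLine and skip() actions do to the file content.
# The sample content mimics a small Windows Autopilot csv export.
file_content = (
    "Device Serial Number,Windows Product ID,Hardware Hash,Group Tag,Assigned User\n"
    "1234-5678-90,,ABCDEF==,Sales,user@contoso.com\n"
)

# SplitNewLine: split the single block of text into separate lines.
lines = file_content.split("\n")

# CSVData: skip(..., 1) removes the header line, which we don't process.
csv_data = lines[1:]

print(csv_data[0])  # first data row
```

Note that splitting on the trailing newline also produces an empty last element, which is exactly the empty line we filter out with a condition later in the flow.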

Split the data and parse to JSON

In the next part we’re going to split the data by comma and parse the data to JSON, so we can use it with Graph.

First add a For each action.
As input we use the output from the CSVData compose action.

Add a Compose action and rename it to SplitByComma.
As input, we add this expression to split the data by comma.

split(item(), ',')

With the next Compose action, we create some sort of variables that we can later use in our HTTP actions. If we don't create these variables, we cannot use each element separately in our flow.
We’re first manually creating an empty JSON string (in a text editor) that holds all variables we want to process from the csv.

{
  "Device Serial Number": "",
  "Windows Product ID": "",
  "Hardware Hash": "",
  "Group Tag": "",
  "Assigned User": ""
}

Add a Compose action and rename it to JSON.
Paste the manually created JSON string in the input box.
Make sure your cursor is between the double quotes and add the Dynamic content Outputs from SplitByComma.

Click on the three dots in the right corner of the Compose action and choose Peek code.
Copy the text @{outputs('SplitByComma')}
Click Done.

Paste this text in your text editor.
Insert ?[0] before the closing curly brace, so it reads @{outputs('SplitByComma')?[0]}

This expression represents the first element of the data produced in the previous Compose action.

Indexing starts at zero, so the second element is ?[1], and so on. Copy the expression to the other lines and change the index numbers.
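The completed JSON input, with all five indexed expressions filled in, should look something like this (assuming the Compose action is named SplitByComma):

```json
{
  "Device Serial Number": "@{outputs('SplitByComma')?[0]}",
  "Windows Product ID": "@{outputs('SplitByComma')?[1]}",
  "Hardware Hash": "@{outputs('SplitByComma')?[2]}",
  "Group Tag": "@{outputs('SplitByComma')?[3]}",
  "Assigned User": "@{outputs('SplitByComma')?[4]}"
}
```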
Copy the whole JSON string.

Delete the existing data in the Inputs box and paste the JSON string in the box.

Save the flow, so we can run the Logic Apps flow for the first time. This will create example data for our next action. Copy a csv file that holds Windows Autopilot device information to the SharePoint folder and run the flow.
When it’s finished open the flow from the Runs history. Browse to the JSON action and open it.
Copy the outputs data so we can use it as example data for our next action.

Switch back to the flow itself to edit the flow.
Add a new Parse JSON action, found under the Data operations.
As input, we take the Outputs from the JSON action.
Click on Use sample payload to generate schema.

Paste the data we previously copied from the runs history and click Done.

Our Parse JSON now looks like below.

This part of the flow is finished. We have parsed the data from the csv file, so we can use it with our Graph queries later in the flow.

Get Windows Autopilot Device Identities

The csv file data still contains an empty last line. This is filtered out by adding a condition in the flow.
This is followed by our first HTTP action against Microsoft Graph, which checks the Autopilot service to see whether we're uploading an existing serial number or a new one.

First add a Condition, which is found under Control, to filter out the empty rows of data.
As the first value, we use the Device Serial Number from the Parse JSON action. Choose is not equal to from the drop-down list. Leave the second value empty.

We leave False empty, so the flow stops the data processing for empty lines.
Under True, we add a Get Secret action, which is a Key Vault action.
Sign-in and enter the name of the secret.

We now add our first HTTP action.
As Method select GET.
As URI enter:

https://graph.microsoft.com/beta/deviceManagement/windowsAutopilotDeviceIdentities?$filter=((contains(serialnumber,'[SERIAL]')))

Replace [SERIAL] with Device Serial Number from the Parse JSON action.
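The lookup query this HTTP action performs can be sketched in Python; only the URL construction is shown, since actually sending the request needs an OAuth token from the app registration, and the serial number is an illustrative example:

```python
# Sketch of the serial number lookup the HTTP GET action performs.
GRAPH_BASE = "https://graph.microsoft.com/beta"

def build_lookup_url(serial_number: str) -> str:
    """Build the Graph query that checks whether a serial already exists."""
    filter_expr = f"contains(serialnumber,'{serial_number}')"
    return (f"{GRAPH_BASE}/deviceManagement/windowsAutopilotDeviceIdentities"
            f"?$filter=(({filter_expr}))")

print(build_lookup_url("1234-5678-90"))
```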

Check Authentication and choose Active Directory OAuth as Authentication type. Enter the Tenant, Client ID (from the app registration) and Audience.
And make sure to select the Key Vault secret value from the Key Vault action.

Next, add a Parse JSON action.
As input, we use the Body from the previous HTTP action.

To fill the schema with the correct information, we can add an example payload. The information to use as an example payload can be ‘generated’ by performing the same GET query (as in the previous HTTP action), but this time using Graph Explorer. Copy the response in Graph Explorer like the response below.


Switch back to the Logic app and click Use example payload.
Paste the Graph explorer response in the text box and click Done.

Add a switch and finish the ‘serial already exists’ case

By adding a switch to our flow, we can create two cases based on the output of the previous HTTP action.
If the serial number was previously added to the Windows Autopilot service, we get a response with odata.count = 1. If it's a new serial number, we get odata.count = 0. With this information, we can split our flow into two cases. If it's an existing serial, we finish the flow by sending an email from a shared mailbox to inform the requestor that the serial is already in the Autopilot service.
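The switch logic boils down to branching on the @odata.count value in the lookup response. A small Python sketch of that decision (function and case names are illustrative):

```python
# Sketch of the Switch logic: branch on @odata.count from the lookup response.
def import_decision(lookup_response: dict) -> str:
    """Return which case of the switch fires for a given Graph response."""
    count = lookup_response.get("@odata.count", 0)
    if count == 1:
        return "existing"   # notify the requestor; serial is already present
    if count == 0:
        return "new"        # continue with the import POST
    return "default"        # anything else falls through to the default case

print(import_decision({"@odata.count": 0}))
```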

Add a Switch to the flow.
In the On box select @odata.count from the last Parse JSON.
Add two cases. In one Equals box enter 1 and in the other box enter 0.

In the Existing case add a Send an email from a shared mailbox action, found under Office 365.

Fill in the email address from the shared mailbox as Original Mailbox Address.
Search for Created By Email as Dynamic content to add into the To box. This retrieves the email address from the initiator.
You can enter your own text in the Body field. You can also use dynamic content, like the Device Serial Number.

Finish the Logic app flow

Under the new case add an HTTP action. With this action we import the Autopilot device data.
As Method select POST.
As URI enter:

https://graph.microsoft.com/beta/deviceManagement/importedWindowsAutopilotDeviceIdentities/

In the Body field enter:

{
  "serialNumber": "",  
  "hardwareIdentifier": "",
  "groupTag": "",
  "assignedUserPrincipalName": ""
}

On every line between the quotes, we’re adding dynamic content from the first Parse JSON action.

Add the required dynamic content as in the example below.
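Building the POST body can be sketched in code as follows; the row dict stands in for the dynamic content from the first Parse JSON action, and the values are illustrative:

```python
import json

# Sketch of the import POST body, built from a parsed csv row.
# The row dict mirrors the JSON produced earlier in the flow.
def build_import_body(row: dict) -> str:
    """Map the parsed csv fields onto the Graph import payload."""
    payload = {
        "serialNumber": row["Device Serial Number"],
        "hardwareIdentifier": row["Hardware Hash"],
        "groupTag": row["Group Tag"],
        "assignedUserPrincipalName": row["Assigned User"],
    }
    return json.dumps(payload)

body = build_import_body({
    "Device Serial Number": "1234-5678-90",
    "Windows Product ID": "",
    "Hardware Hash": "ABCDEF==",
    "Group Tag": "Sales",
    "Assigned User": "user@contoso.com",
})
print(body)
```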
As in the previous HTTP action, select the Authentication type, enter the Tenant, Audience and Client ID. And make sure to select the Key Vault secret from the Key Vault action.

Add a Parse JSON action. As Content, we add Body from the previous HTTP action.
As sample payload, we can run the flow as we did in a previous step. The output from the previous HTTP action, found in the runs history, can be used as the sample payload.

As importing Autopilot device information can take a couple of minutes and we want to process the import status after the import is finished, we add an Until action to the flow with a Delay in it. The import status is initially unknown. The Until action loops until the import status is no longer unknown.

Add the Until action.
Don’t fill in the value yet, as we will add an HTTP GET action in this until action to get the deviceImportStatus.
You can change the limits of the until action. I leave it default.

Add a Delay action. I set the delay to 15 seconds; every 15 seconds, the body of the Until loop runs again.

Add an HTTP Action.
Choose GET as Method.
As URI enter:

https://graph.microsoft.com/beta/deviceManagement/importedWindowsAutopilotDeviceIdentities/[id]

Replace [id] with the id from the Parse JSON action that follows the POST action.

Select the Authentication type, enter the Tenant, Audience and Client ID. And make sure to select the Key Vault secret from the Key Vault action.

Add a Parse JSON action.
As Content, we use the Body from the HTTP GET action.
As schema, we need to use the output from the previous HTTP GET action. You can use an example again, by running the flow or use Graph Explorer to run the HTTP GET query. Paste the example in the action via Use example payload.

Now that the HTTP action with the Parse JSON is in place, we can use the output from the Parse JSON in the Until action itself.
As value on the left, we choose the dynamic content deviceImportStatus from the last Parse JSON action.
The first part of the new case flow with the Until loop looks like below.
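The Until loop with the Delay and the HTTP GET boils down to the polling pattern below. In this sketch, get_status is a hypothetical stand-in for the HTTP GET on the import job, and the 15-second delay is set to zero in the example so it runs instantly:

```python
import time

def wait_for_import(get_status, delay_seconds=15, limit=60):
    """Poll the import status until it is no longer 'unknown'.

    get_status stands in for the HTTP GET on
    /importedWindowsAutopilotDeviceIdentities/{id}; it returns the
    deviceImportStatus value ('unknown', 'complete', 'error', ...).
    """
    for _ in range(limit):
        status = get_status()
        if status != "unknown":
            return status
        time.sleep(delay_seconds)
    return "unknown"  # loop limit reached without a final status

# Example with a fake status source that completes on the third poll.
responses = iter(["unknown", "unknown", "complete"])
final_status = wait_for_import(lambda: next(responses), delay_seconds=0)
print(final_status)  # the Condition after the loop then checks for 'complete'
```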

Close the Until action and below this action add a Condition.
We use the final output from the HTTP GET action, to get the final status of the import. If the deviceImportStatus gives a complete status back, the import is successful. Anything else is a failure. With the Condition, we can split up our flow and handle the outcome with a status email to the initiator of the flow.

Add a Condition.
As value on the left, enter the dynamic content deviceImportStatus from the last Parse JSON action.
Choose is equal to and as value on the right fill in complete.

I added a Send an email from a shared mailbox action to this part of the flow, for true and for false. The only difference is the body text and subject, to inform the initiator about the final status.

With these Send an email actions, the flow is finished and we have some error handling built in.

Thanks for reading, and if you have any questions, drop me a comment below.



