After working with Azure Websites (now Azure Web Apps) for a long time, I have noticed that every day I use some tips and tricks that surprise people in some way. The tips are normally nothing spectacular, but once you start using them your daily productivity is enhanced. I’m going to start posting some of these tips, both to have my own “notebook” to revisit when needed and to share them, making small tasks easier: working with files, website management, getting alerts, etc.
So let’s start with a simple tip:
“How to download the website contents of a site hosted on Azure Web Apps”
There are many ways to answer this question that come to mind: FTP, MS Deploy, uploading the 7-Zip command-line tool via Kudu... but is there a simpler way when you just want to download the content?
The answer is YES.
Azure Web Apps ZIP API
After spending some time working with Azure Web Apps, you have probably noticed that “behind” your website there are a lot of tools served by the SCM site (also known as Kudu).
You can directly access this service by simply browsing to “https://mywebsite.scm.azurewebsites.net”, where “mywebsite” is your site name, and then entering your Azure credentials (an Azure AD or Microsoft account). You can also use Basic authentication by browsing to “https://mywebsite.scm.azurewebsites.net/basicauth” and entering the deployment credentials available in your website settings in the Azure Management portal.
Kudu offers a user interface full of tools to manage and diagnose your web app:
And if you dig into the Kudu documentation, you will notice that some REST APIs come out of the box. One of these is the ZIP API, which allows downloading folders as zip files and expanding zip files into folders:
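To illustrate both directions of the ZIP API, here is a minimal PowerShell sketch. The site name, deployment credentials and local paths are placeholders; note that Invoke-RestMethod with -Credential answers Kudu’s Basic authentication challenge.

```powershell
# Placeholders — replace with your site name and deployment credentials
$securePassword = ConvertTo-SecureString "myDeployPassword" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential -ArgumentList "myDeployUser", $securePassword

# GET: download a folder as a zip file
Invoke-RestMethod -Credential $creds `
    -Uri "https://mywebsite.scm.azurewebsites.net/api/zip/site/wwwroot/" `
    -OutFile "D:\Temp\wwwroot.zip"

# PUT: upload a local zip and expand it into a folder
Invoke-RestMethod -Credential $creds -Method Put `
    -Uri "https://mywebsite.scm.azurewebsites.net/api/zip/site/wwwroot/" `
    -InFile "D:\Temp\deploy.zip"
```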
Download website contents using a simple URL in your browser
That’s it! Just enter this URL in your browser and type your credentials, and you can download the full contents of your website:
https://mywebsite.scm.azurewebsites.net/api/zip/site/wwwroot
If you want to download a subfolder of your website, you could use something like:
https://mywebsite.scm.azurewebsites.net/api/zip/site/wwwroot/subfolder1
Note that you can download anything inside the “D:\home” folder, which is what the “/api/zip” paths are relative to. So if, for example, you want to download all your site log files, including the IIS log files, you can use the following URL:
https://mywebsite.scm.azurewebsites.net/api/zip/LogFiles
NOTE: an equivalent option is the “dump” API:
https://mywebsite.scm.azurewebsites.net/api/dump
Adding some Azure PowerShell sauce
It’s quite common to download these log files to your PC and then run your favourite log parsing tool, such as Log Parser or Log Parser Studio. It’s easy to download them manually from Kudu, but it’s no fun when you have to do the same task almost every day across hundreds of websites.
So why not use PowerShell to automate the task?
After installing Azure PowerShell, you can run the following script to download files and folders using the Kudu ZIP REST API. You can tweak it a little by iterating over all your websites, and even over all your subscriptions, so you can download the IIS logs of every website you own with just a few lines of code.
In the following script, I’ve changed the folder to download to the specific one where DNN Platform stores its daily Log4Net logs, which, by the way, you can then review on-premises using Log4View.
NOTE: I used PowerShell 5.0, available on Windows 10, which supports “wget” (an alias for “Invoke-WebRequest”) and “Invoke-RestMethod”, simplifying the script.
# Input parameters
$subscriptionName = "MySubscriptionName"
$websiteName = "MyWebsitename"
$slotName = "Production"
$folderToDownload = "site/wwwroot/Portals/_default/logs/" # must end with / for folders or you will get a 401
$outputZipFile = "D:\Temp\LogFiles.zip"
# Ask for Azure credentials to obtain the publishing credentials
Add-AzureAccount
Select-AzureSubscription $subscriptionName
# Build the basic authentication header
$website = Get-AzureWebsite $websiteName -Slot $slotName
$publishingUsername = $website.PublishingUsername
$publishingPassword = $website.PublishingPassword
$base64AuthInfo = [System.Convert]::ToBase64String( `
[System.Text.Encoding]::ASCII.GetBytes(( `
"{0}:{1}" -f $publishingUsername, $publishingPassword)))
# Download the log files using Invoke-RestMethod, available in Windows PowerShell 5.0 :)
Invoke-RestMethod -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} `
-Uri "https://$websiteName.scm.azurewebsites.net/api/zip/$folderToDownload" `
-OutFile $outputZipFile
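As mentioned above, the script can be extended to iterate over all the websites in a subscription. Here is a hedged sketch using the same classic Azure PowerShell cmdlets; the output folder is an example, and each site is re-queried by name because the list returned by Get-AzureWebsite does not include the publishing credentials.

```powershell
# Sketch: download the log files of every website in the current subscription.
# Assumes you have already run Add-AzureAccount and Select-AzureSubscription.
foreach ($site in Get-AzureWebsite) {
    $name = $site.Name
    # Re-query by name to obtain the publishing credentials
    $fullSite = Get-AzureWebsite $name
    $pair = "{0}:{1}" -f $fullSite.PublishingUsername, $fullSite.PublishingPassword
    $auth = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))
    Invoke-RestMethod -Headers @{Authorization = ("Basic {0}" -f $auth)} `
        -Uri "https://$name.scm.azurewebsites.net/api/zip/LogFiles/" `
        -OutFile "D:\Temp\$name-LogFiles.zip"
}
```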
And don’t forget the other Kudu APIs available!
I have some other tips I use on a daily basis that I will be posting soon. Don’t forget to take a look at the other available APIs, because they are a box of surprises:
- api/vfs: allows you to perform file and folder operations
- api/command: allows you to execute commands (one of my favourites when combined with the previous one)
- api/settings: allows you to change the site settings
- api/dump, api/diagnostics, api/logs: for diagnostics, tracing, etc.
- api/scm, api/deployments, api/sshkey: for repository and deployment management
- api/siteextensions: enable or disable other site extensions such as Visual Studio Monaco. The list of available extensions grows monthly, so don’t forget to revisit it from time to time
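As a taste of the command API from the list above, here is a minimal sketch that runs a command inside the web app sandbox. The site name and deployment credentials are placeholders, and the JSON body shape (command plus working directory) follows the Kudu wiki:

```powershell
# Placeholder deployment credentials — get yours from the Azure portal
$pair = "myDeployUser:myDeployPassword"
$auth = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))

# Run "dir" inside site\wwwroot via POST /api/command
$body = @{ command = "dir"; dir = "site\wwwroot" } | ConvertTo-Json
$result = Invoke-RestMethod -Method Post `
    -Headers @{Authorization = ("Basic {0}" -f $auth)} `
    -ContentType "application/json" `
    -Body $body `
    -Uri "https://mywebsite.scm.azurewebsites.net/api/command"

$result.Output   # stdout returned by the command
```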
Best regards and Happy Coding!