Set up Azure Files for IIS

Setting up a share in Azure Files is straightforward; a few lines of PowerShell are all it takes.

This assumes you have applied to the Azure Files preview program and received the notification email from the Azure team that Azure Files is available for your subscription.

Azure Files will then be available for every storage account you create after the feature was enabled for your subscription.


After you’ve created a new storage account, Azure Files is configured automatically and shows up in the service list on the storage dashboard in the Azure portal.

Create a share in PowerShell

Creating a share in PowerShell takes two steps. First, create a storage context object from the account name and access key. Then pass the context and the name of the share to the New-AzureStorageShare cmdlet.

param (
    [string]$StorageAccount = "",
    [string]$StorageKey = "",
    [string]$ShareName = ""
)
# Create the storage context from the account name and key
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageKey
# Create the share in that storage account
$share = New-AzureStorageShare $ShareName -Context $ctx

To run the script above you need your storage account name, the access key (you can find it in the keys section of the storage dashboard in the Azure portal), and the name of the share you want to create.
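Saved as a script file, it can be invoked like this (the file name and parameter values are placeholders, not part of the original post):

```powershell
# Hypothetical script name; replace the values with your own account details
.\New-FileShare.ps1 -StorageAccount "mystorageaccount" `
                    -StorageKey "<access key from the portal>" `
                    -ShareName "myshare"
```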

Once that’s done, your Azure Files share is available. The easiest way to access the share is to run a net use command from the command line, like:

net use z: \\<account name>.file.core.windows.net\<share name> /u:<account name> <account key>

The command above maps the UNC path \\<account name>.file.core.windows.net\<share name> to a drive named z:. Files and directories are available from that moment on under z:\.
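A drive mapped this way does not survive a reboot unless the credentials are persisted. One way to do that, sketched here with placeholder values, is to store the credentials with cmdkey first and then map the drive persistently:

```powershell
# Store the storage account credentials in the Windows credential manager
cmdkey /add:<account name>.file.core.windows.net /user:<account name> /pass:<account key>
# Map the drive; the stored credentials are picked up automatically
net use z: \\<account name>.file.core.windows.net\<share name> /persistent:yes
```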

Unfortunately, it’s not possible to access files like this from IIS. Instead, you will receive errors like an authentication failure, an inaccessible Web.config, or simply a page-not-found error.

Also, you can’t configure ACLs for the share, which makes it difficult to allow the app pool identity to access the shared directories.

How to prepare the IIS application pool

Let’s say we have a web application that needs to create images on the share, and clients should also be able to request those images over HTTP.

1) First create a new local user using the Computer Management snap-in. Use the storage account name as the user name and the storage key as the password.

2) Now make the new user the identity of the application pool in IIS.

3) Set up a virtual directory that points to the UNC path of the shared directory.

4) Open the Edit Virtual Directory dialog (select Basic Settings… in the Actions pane on the right), click Connect as… and select the new user to connect.


5) In the Connect as… dialog, configure the specific user and use the same credentials as for the application pool. (I tried the Application user (pass-through authentication) option, but couldn’t access the files in the browser.)


Now you should be able to read and write files on the share from the application, and browser clients should also be able to stream resources from the share.
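The manual steps above can also be scripted. Here is a sketch using appcmd.exe (found in %windir%\System32\inetsrv); all names, paths, and credentials are placeholders, and the storage key is used as the password:

```powershell
# 1) Create the local user: user name = storage account name, password = storage key
net user mystorageaccount "<storage key>" /add

# 2) Run the application pool under that identity
appcmd set apppool "MyAppPool" /processModel.identityType:SpecificUser /processModel.userName:mystorageaccount /processModel.password:"<storage key>"

# 3) Create the virtual directory on the UNC path of the share
appcmd add vdir /app.name:"Default Web Site/" /path:/images /physicalPath:\\mystorageaccount.file.core.windows.net\myshare

# 4/5) Set the connect-as credentials of the virtual directory
appcmd set vdir "Default Web Site/images" /userName:mystorageaccount /password:"<storage key>"
```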


Using Microsoft Azure File Service

Almost a year ago Microsoft introduced a file share technology called the Microsoft Azure File Service. Although it’s still in preview and you need to apply for it from within the Azure portal, it offers the most convenient (and reliable) way to share directories and files between virtual machines in Azure. The Azure File Service adds powerful flexibility to the IaaS platform, especially when your application is highly scalable and needs to process and share a lot of files in a reliable manner.

Setup

To set up a share you need to activate the Azure File Service. The best way to do that is to use the new Azure portal and sign up for the Azure File Service preview program; this can be done in the Azure Storage area. I received the sign-up approval a few minutes later and the File Service was unlocked, which enabled me to select a file endpoint when creating a new storage account.

Once the new file storage account is configured, it’s available to VMs, worker roles, and web roles, as long as they’re located in the same region where the storage account is hosted.

To access Azure Files from a virtual machine run the following command in a command prompt on the box:

net use z: \\<account name>.file.core.windows.net\<share name> /u:<account name> <account key>

That’s it. Now the user can access files and folders on the mapped drive. The account name, account key, and share name are all available in the portal.

Allowing IIS to access Azure Files

I needed to grant the IIS application pool user as well as IUSR access to the share. That was not easy, because you can’t configure ACLs directly on the share.

I could, however, create a Windows user with the storage account name as user name and the storage key as password. I assigned those credentials to the app pool identity and created a virtual directory in IIS, using the same Windows user to connect. I couldn’t access files using the drive letter path, but it worked pretty well using the UNC path \\<account name>.file.core.windows.net\<share name> instead.

Stopping Azure VMs without losing your IP address

Stopping a virtual machine in Microsoft Azure can be tricky if you rely on the IP address assigned to the virtual machine, because by default Azure releases the IP address when you stop the machine.

To avoid losing the IP address you can use the Stop-AzureVM cmdlet together with the -StayProvisioned switch. By using this switch you are basically telling Azure that you’d like the VM to be stopped but kept provisioned, which prevents Azure from releasing the IP address.

Example:

Stop-AzureVM -ServiceName 'cloud service name' -Name 'virtual machine name' -StayProvisioned
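You can verify the result afterwards; this is a sketch using the classic service management cmdlets. A VM stopped with -StayProvisioned reports the status StoppedVM, while a deallocated one reports StoppedDeallocated. Note that a VM in the StoppedVM state keeps its IP address but continues to accrue compute charges.

```powershell
# Inspect the status of the stopped VM
(Get-AzureVM -ServiceName 'cloud service name' -Name 'virtual machine name').Status
```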

How to disable the BGInfo extension in Azure VM Agent

BGInfo is a very useful tool by Sysinternals that collects machine configuration data and renders it into an image, which is then set as the background of the Windows desktop.

When a new virtual machine is created, the current Microsoft Azure platform adds a lightweight process, called the VM Agent, to the machine. The idea behind the VM Agent is to provide extensibility for the virtual machine, making it easy for the Azure platform to add other solutions to the machine. One extension, for instance, allows you to reset the default user password, which was impossible earlier. Read more about the VM Agent and VM extensions here.

The VM Agent Extension manager also contains BGInfo and enables it by default.

You can disable the VM Agent itself, either by unticking the option in the virtual machine configuration on the Azure management website or by using PowerShell cmdlets; read more here.
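Disabling the agent from PowerShell can be sketched like this; untested here, and it assumes the classic service management module, which exposes a ProvisionGuestAgent flag on the VM object:

```powershell
# Fetch the VM, switch off the guest agent flag, and push the change back
$vm = Get-AzureVM -ServiceName 'CLOUD_SERVICE_NAME' -Name 'VM_NAME'
$vm.VM.ProvisionGuestAgent = $false
$vm | Update-AzureVM
```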

To disable just a specific extension, in this case BGInfo, run the following PowerShell command:

Get-AzureVM -ServiceName 'CLOUD_SERVICE_NAME' -Name 'VM_NAME' | Set-AzureVMBGInfoExtension -Disable -ReferenceName "BGInfo" | Update-AzureVM

Replace the cloud service and virtual machine names with yours.

PowerShell Windows remote management of virtual machines in Azure

Working remotely on virtual machines in Windows Azure using PowerShell is something I do quite often. In fact, the first thing I do after creating a new virtual machine is to enable Windows Remote Management (WinRM). Enabling WinRM makes it possible to connect from your local PowerShell to a PowerShell session on the target machine, just like an SSH session in the Linux world. You can also register PowerShell script blocks, stored in external files, and have them executed on the target machine. Because WinRM in the Windows Azure world demands an SSL/TLS connection, the communication is secured.

Adding a virtual machine endpoint for PS remoting

If it’s not created already you need to create an endpoint for the virtual machine to allow PowerShell remoting. PS remoting uses TCP port 5986, so the definition of the Azure endpoint that allows HTTPS-secured PowerShell remote sessions might look like this:

Get-AzureVM $mySvc $vmName | Add-AzureEndpoint -Name PS-HTTPS -Protocol TCP -LocalPort 5986 -PublicPort 5986 | Update-AzureVM
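To check that the endpoint was created, you can list the endpoints of the machine; a sketch using the classic cmdlets:

```powershell
# List all endpoints configured on the VM, including the new PS-HTTPS one
Get-AzureVM $mySvc $vmName | Get-AzureEndpoint
```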

Enabling PowerShell remoting on the target machine

If you are dealing with a virtual machine that runs a Windows version older than Windows Server 2012 R2, you need to run the following command:

Enable-PSRemoting

This command starts the WinRM service, sets its startup type to Automatic, and adds an (HTTP) listener on all addresses the VM provides.

Microsoft made it much easier for us with Windows Server 2012 R2, because since then the WinRM service runs by default.

Adding an HTTPS listener to WinRM

By default WinRM installs an HTTP listener, which is less secure. In Windows Azure we need to secure the communication using SSL/TLS, so it’s necessary to add the appropriate listener on the target machine. This HTTPS listener needs to be bound to a server certificate, which can be self-signed or issued by one of the usual certificate authorities on the internet.

If you have the certificate installed, you can run the following command in a standard command shell on the target machine to add the listener:
winrm create winrm/config/Listener?Address=*+Transport=HTTPS @{Hostname="HOSTNAME.cloudapp.net";CertificateThumbprint="THUMBPRINT"}

You need to replace HOSTNAME with the host name of the target machine and THUMBPRINT with the thumbprint value of the server certificate.
Now the virtual machine is ready to accept PowerShell remoting.
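If no server certificate was installed beforehand, a self-signed one can be created directly on the target machine; New-SelfSignedCertificate requires Windows Server 2012 R2 or later, and the host name below is a placeholder. Afterwards you can enumerate the configured listeners to verify the HTTPS listener exists:

```powershell
# Create a self-signed certificate in the machine store; note the printed thumbprint
New-SelfSignedCertificate -DnsName "HOSTNAME.cloudapp.net" -CertStoreLocation Cert:\LocalMachine\My

# After adding the listener, list all WinRM listeners to verify it
winrm enumerate winrm/config/listener
```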
 

Open a Powershell Remote session using SSL and a credentials object

To open a remote session on the target machine in PowerShell we need to create a credentials object based on a username and password pair of the remote machine. We use a secure string object in PowerShell to handle the password securely.
 
# Plain-text credentials (placeholders)
$myPwd = "PASSWORD"
$username = "USERNAME"
# Convert the password into a secure string and build the credential object
$password = ConvertTo-SecureString $myPwd -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
# Open the remote session over HTTPS
Enter-PSSession -ComputerName HOSTNAME.cloudapp.net -Credential $cred -UseSSL

First we declare two variables for the password and user name. Then we convert the plain-text password into a secure string representation. Based on the secure password and the user name we create a PowerShell credential object.

Finally, the Enter-PSSession command connects to the remote machine and switches your PowerShell context to the remote session, just like SSH does on a Linux system.
 

Running a remote script using SSL and a credentials object

Running a script on a remote machine needs the same setup; we also need to provide a credentials object. The command can be a single line of code or a whole PowerShell script stored in an external file.
 
# Build the credential object (assumes $myPwd and $username are defined as above)
$password = ConvertTo-SecureString $myPwd -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
$computername = "HOSTNAME.cloudapp.net"
# Script block that enables automatic admin logon in the registry
$cmdAutoAdmin = { New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon' -Name AutoAdminLogon -Value "1" }
Invoke-Command -ComputerName $computername -Credential $cred -UseSSL -ScriptBlock $cmdAutoAdmin

This example uses Invoke-Command to patch a registry value on the remote machine. First, the command is stored in the $cmdAutoAdmin variable and then invoked on the remote machine.
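To run a whole script file instead of an inline script block, Invoke-Command accepts a -FilePath parameter; the local file is read and executed in the remote session. The file name here is a placeholder:

```powershell
# Execute a local script file on the remote machine over the secured session
Invoke-Command -ComputerName $computername -Credential $cred -UseSSL -FilePath .\Configure-Server.ps1
```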

How to accept connection loss and SQL client exceptions in SQL Azure and still live happily ever after

Dealing with a failover cluster, like a database hosted in SQL Azure, unfortunately means dealing with connection issues, simply because you are not connected to a single instance but to a cluster of typically three SQL Server instances. Connection requests are routed by a load balancer to one of those instances.

From time to time (quite often to be honest) instances in SQL Azure are switched off and replaced by new instances for maintenance reasons. This is when your client might receive connection timeout exceptions. Here are a few of the exceptions I received:

  • The client was unable to establish a connection because of an error during connection initialization process before login. Possible causes include the following: the client tried to connect to an unsupported version of SQL Server; the server was too busy to accept new connections; or there was a resource limitation (insufficient memory or maximum allowed connections) on the server. (provider: TCP Provider, error: 0 – An existing connection was forcibly closed by the remote host.)
  • A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 – An existing connection was forcibly closed by the remote host.)
  • The service is currently busy. Retry the request after 10 seconds.
  • Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

The cause of those exceptions, for instance an instance that is no longer available, is actually pretty reasonable; in fact it is something one needs to keep in mind when interacting with a cluster instead of a single server.

But how to cope with it?

Transient Fault Handling

Because I wasn’t the first who was facing this problem I found a pretty good solution based on the Microsoft patterns and practices guide, called Transient Fault Handling.

The basic idea of this solution is that there are differences between the exceptions you receive when talking to SQL Azure: some are transient, and thus worth another try, and some simply aren’t. If you are facing a connection issue, you want to try again a few times, a few seconds apart. A syntax error somewhere in the SQL query, on the other hand, wouldn’t make sense to retry.

Nuget packages

All you need is to install the Transient Fault Handling Core package if you only need LINQ support, or the Enterprise Library 5.0 Transient Fault Handling Application Block package otherwise. Just search for TransientFault in the NuGet package manager console.

ITransientErrorDetectionStrategy

After the reference(s) are added you can implement the ITransientErrorDetectionStrategy interface, which provides exactly one method, named IsTransient.

There are different sample implementations on the web; some deal with SqlException error numbers, some evaluate the exception message text. I prefer not to parse the exception text; it seems fragile to me, because those messages might be localized.

An implementation might look like this.

Public Class MyRetryStrategy
    Implements ITransientErrorDetectionStrategy

    Public Function IsTransient(ex As Exception) As Boolean Implements ITransientErrorDetectionStrategy.IsTransient
        If TypeOf ex Is SqlException Then
            ' Check the native SQL Server error number against known transient errors
            Dim sqlEx As SqlException = DirectCast(ex, SqlException)
            Select Case sqlEx.Number
                Case 40197, 40501, 10053, 10054, 53, 10060, 40613, 40143, 233, 64, -2
                    Return True
            End Select
        ElseIf TypeOf ex Is TimeoutException Then
            ' Timeouts are worth a retry as well
            Return True
        End If
        Return False
    End Function

End Class

A much more thoroughly implemented version can be found here.

Dealing with error numbers, on the other hand, is tricky too, because native SQL Server errors are wrapped by the .NET Framework and can be surfaced by different sources.

I decided to check the most common error numbers, but to log every exception that is inspected by the IsTransient method. If an error number is not yet in my transient exception list, I can still add it later.

Retry aware execution

Executing a retry-aware LINQ statement is actually quite easy. First you need to define a RetryPolicy based on your retry strategy; in the constructor you also pass the number of retries and the time span to wait before the next retry.

Dim myRetryPolicy = New RetryPolicy(Of MyRetryStrategy)(5, TimeSpan.FromSeconds(3))

Now that we have created a policy, we can take the actual LINQ query and wrap it into a retry policy action. The policy’s ExecuteAction method consumes the LINQ statement as a lambda function and executes it, but now retry-aware; in other words, it wraps an extra try/catch block around it.

e.Result = myRetryPolicy.ExecuteAction(
        Function()
            Using db = DBHelper.GenerateDataContext()
                Dim src = (From a In db.tblAccounts)
                If src Is Nothing Then
                    Throw New ArgumentException("src")
                End If
                Return src.ToList
            End Using
        End Function)

An exception thrown inside the action will be caught by the retry policy handler and forwarded to the retry detection strategy implementation. If the exception is transient, the action will be executed again until the execution is successful. If the exception wasn’t considered transient, or the retry limit was hit, the original exception is finally rethrown.

Conclusion

This enterprise application block from Microsoft’s patterns & practices team is a pretty straightforward solution, easy to implement, and exactly what I was looking for to work reliably with SQL Azure databases.