All of us, at some point, whether in our educational institutes or in the professional world, have used a file share, i.e. the common storage space maintained by the IT team of an organization, where users can create their own folders, access common software installers, or keep project-specific documents. Usually we mount this storage space as a network drive for easy and quick access.
This article describes the following topics:
- Basics of Azure File Storage
- Creating the File storage service using the Azure portal
- Mounting a file share as a drive
- Accessing a file share using the storage client libraries
- Setting SAS policies on content in a file share using the storage client libraries
Basics of Azure File Storage
Azure File Storage is an offering of Microsoft Azure that is equivalent to an SMB file share. The legitimate question to ask here is: since I can anyway have an SMB file share implementation on premises, why would I go with Azure File Storage? Here are your reasons:
- You can quickly migrate legacy applications that rely on a file share.
- You avoid costly re-writes.
- Azure File Storage becomes your central share, which can be consumed by virtual machines running in Azure, cloud services, and any on-premises clients that speak the SMB protocol.
- Better control and more built-in accessibility options are available, e.g. System I/O APIs, the REST API, client libraries, and PowerShell cmdlets.
- Easy integration with Linux.
- Better scalability and performance targets.
- Simplified hosting for high-availability workload data.
Note that, as of today, File storage doesn't support an Active Directory based authentication mechanism to validate access requests. Instead it uses the access key based mechanism of the storage account. If you need more control over the stored content, you should use shared access signature tokens as an alternative.
What is SMB
File shares are typically driven by a protocol called SMB, which stands for Server Message Block. It is nothing but a mechanism to provide shared access to files, printers, and serial ports over the network.
Microsoft's implementation of this protocol is called Microsoft SMB. SMB 2.0 was introduced with the Windows Vista operating system, and it was later revised in Windows 7, with 2.1 and 3.0 being the subsequent major revisions as of today.
Concept of Azure File Storage
Let's try to understand the concepts behind Azure File Storage.
Storage Account
Azure File Storage is basically a subset of the Azure storage services, and hence we will need an Azure storage account.
Share
Shares can be considered logical representations of the drives which you can map. A share is a container of directories; you can create an unlimited number of shares within a storage account, and each share can store a maximum of 5 TB of data.
Directory
These are
nothing but the folders you can create within a file share. It is an optional
entity in the hierarchy.
File
You can store any number and any type of files in a file share. Each share has its own quota limit, which can be set to a maximum of 5 TB. The maximum size of a single file is 1 TB.
The hierarchy can be visualized as below.
As shown in the image above, the storage account contains two file shares, i.e. Share 1 and Share 2. Each share has two directories, and each directory contains some files.
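This hierarchy maps directly onto the object model of the storage client library used later in this article. As a rough sketch (assuming the WindowsAzure.Storage package; the connection string and the share, directory, and file names here are only illustrative):
// Requires the Microsoft.WindowsAzure.Storage and Microsoft.WindowsAzure.Storage.File namespaces
CloudStorageAccount account = CloudStorageAccount.Parse("<connection string>");   // storage account
CloudFileClient fileClient = account.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("share1");                    // share
share.CreateIfNotExists();
CloudFileDirectory root = share.GetRootDirectoryReference();
CloudFileDirectory directory = root.GetDirectoryReference("directory1");          // directory (optional level)
directory.CreateIfNotExists();
CloudFile file = directory.GetFileReference("file1.txt");                         // file
file.UploadText("Sample content");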
The URL of a file stored within a directory or directly in a share is formed in the following format:
https://<storage account>.file.core.windows.net/<share>/<directory/directories>/<file>
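For example, with the storage account, share, and directory created later in this article (demofsstorageaccount, sharedfiles, and Shared Content) and an illustrative file name, the URL would look like this (the spaces are URL-encoded as %20):
https://demofsstorageaccount.file.core.windows.net/sharedfiles/Shared%20Content/The%20Word.docx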
Fail-over and Backup
For any storage solution, one of the key criteria is to make sure that data is not lost, even in a disaster. A file share, being an Azure storage offering, follows all the disaster recovery and failover standards and mechanisms of the storage account; however, there is a slight difference. What is it?
As of writing this article, file share witness and RA-GRS (Read Access - Geo Redundant Storage) are not supported for file shares.
About backup: though there is no official way to back up an Azure file share, there is always a workaround. You can automate the backup process with the help of AzCopy and back up your file share content to blob storage, or install backup agents like Cobian to set up full and incremental backup procedures for the mapped file share drive.
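If you prefer to script such a copy from code rather than AzCopy, one possible sketch with the storage client library (used later in this article) is a server-side copy of a file from the share into a blob container. The container name below is only illustrative, the file names are the ones from the demo, and the source file is granted temporary read access through a SAS token:
// Requires the Microsoft.WindowsAzure.Storage, .Blob and .File namespaces
CloudStorageAccount account = CloudStorageAccount.Parse("<connection string>");
CloudFile sourceFile = account.CreateCloudFileClient()
    .GetShareReference("sharedfiles")
    .GetRootDirectoryReference()
    .GetDirectoryReference("Shared Content")
    .GetFileReference("The Word.docx");

CloudBlobContainer backupContainer = account.CreateCloudBlobClient().GetContainerReference("fileshare-backup");
backupContainer.CreateIfNotExists();

// Short-lived read-only SAS so the blob service can read the source file during the copy
string sasToken = sourceFile.GetSharedAccessSignature(new SharedAccessFilePolicy
{
    Permissions = SharedAccessFilePermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});
CloudBlockBlob backupBlob = backupContainer.GetBlockBlobReference("The Word.docx");
backupBlob.StartCopy(new Uri(sourceFile.Uri.AbsoluteUri + sasToken));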
Creating File Share using Azure Portal
Let's go ahead and set up an Azure file share using the Azure portal. It is a quite straightforward process and starts with the creation of a storage account.
Let's create a storage account with the name demofsstorageaccount within a resource group which I created earlier. You can choose to create a new resource group if you don't have one. Once you are done filling in all the required information, hit the Create button and it will submit the creation job to Azure.
As this is a demo storage account, I have set its type as locally redundant. To read more about the types you can select and their significance, it is recommended that you go through this link at least once.
Once the storage account gets created, browse to it and click on the Files section as selected in the image below.
Click on the Add file share button at the top of the File service panel. It will open up another panel where you can give a name to the file share and declare its quota.
Let's give the file share the name "sharedfiles" and a quota limit of 1 GB; it immediately creates the share for you, which looks like below.
Take note of the highlighted buttons, which are quite self-explanatory. We will take a detailed look at the Connect button later in this article.
Before we add some files, let's add a directory in the file share, which will act as a folder for our files. Name it "Shared Content".
Click on the Upload button and it opens up a panel with a typical file upload control with multiple selection enabled. You can browse the files that need to go on the file share and select the Start upload button in the header of the panel. We can see that the directory and files are now shown in the file share.
Select any file and click on the Properties button in the header. You will be shown the URL of the file, which follows the same format mentioned above in this article.
Note that there is an ETag associated with each file. The ETag changes whenever the contents of the file change, which lets clients and caches detect whether the copy of the document they hold is still current or a fresh copy needs to be sent down when it is requested.
Let's try to update our Excel file by deleting it from the share and uploading a new copy with the same name. Observe the change in the ETag.
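If you want to observe the same thing from code rather than the portal, a small sketch with the storage client library (introduced later in this article) that reads a file's ETag might look like this, assuming a CloudFile reference obtained the same way as in the samples below:
CloudFile file = customDirectory.GetFileReference("The Word.docx");
file.FetchAttributes();                      // populates the file's properties from the service
Console.WriteLine(file.Properties.ETag);     // prints the current ETag of the file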
Mapping File Share as Network Drive
As mentioned before, we can map the file share as a network drive so that the content on it becomes easily accessible and you can browse through the different directories quickly.
It is a quite straightforward process and I am sure many of you might have done it already, but even if you haven't, here are the simple steps to do it.
Before we go ahead and map it as a local drive, there are certain things you need to be aware of:
- The client device should support the SMB 3.0 protocol (Windows 8 and above) and port 445 (TCP outbound) must be open.
- If you are mapping the drive on a Windows virtual machine hosted on Azure in the same subscription and the same region as your file share service, then the traffic between the Azure VM and the file share will be free; otherwise you will be charged for the traffic as external bandwidth.
- Windows 7 devices support SMB 2.1, but access from outside of Azure is restricted due to the lack of channel encryption in SMB 2.1; SMB 2.1 is, however, supported if you are accessing the share from within Azure.
Mapping can be done in multiple ways: one is using the command prompt and the other is using the UI on Windows devices.
Remember the Connect button mentioned above in this article? It shows simple instructions on how you can map your file share as a network drive using the command prompt.
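In essence, those instructions boil down to a single command of roughly this form (the drive letter is your choice, the names are the ones from our demo, and the key is the storage account key from Settings > Access keys):
net use Z: \\demofsstorageaccount.file.core.windows.net\sharedfiles /u:demofsstorageaccount <storage account key>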
Open "My Computer" (the quickest way to launch it is Windows key + "E") and select the Map Network Drive option from the top action links.
Name the drive and enter the path of the file share which we just created.
Since an Azure file share doesn't support any authentication mechanism other than the typical storage key as of now, select both checkboxes, i.e. re-connect on logon and connect using different credentials.
Once you are done, you will be asked for credentials. Make sure you enter your storage account credentials in the authentication dialog. You can get the primary or secondary storage key of your account under the storage account's Settings > Access keys.
After
successful authentication, you should be able to see the mapped drive in your
explorer.
Accessing File Share using Client Libraries
Before we start, make sure that you are using version 5.x or later of the storage client assemblies. Also note that the Azure storage emulator currently does not support file shares, so make sure that you are pointing to the correct file share service on Azure.
Let's create a console application which will access and download the files in the file share we recently created. Create a new console application project in Visual Studio and install the WindowsAzure.Storage NuGet package on it, version 6.2.0 (for example, Install-Package WindowsAzure.Storage -Version 6.2.0 from the Package Manager Console).
The sample
code to access and download the file is as below
Note - The code below is just for demo purposes and may not be the best performing code.
It connects to the file share and verifies that it exists before proceeding; the same check is done while getting hold of the directory and the file to be downloaded. It then downloads the file to the local file system.
Note that it reads the connection string of the storage account from the application configuration file, so to run the code below as-is, you will need to add a key with the same name to your app.config and set its value to the connection string of your storage account (which can easily be found on the Azure portal under Settings > Access keys).
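For reference, the relevant part of app.config would look roughly like this (the account name and key are placeholders for your own values):
<configuration>
  <appSettings>
    <add key="StorageAccountConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=demofsstorageaccount;AccountKey=[your storage account key]" />
  </appSettings>
</configuration>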
using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            // Read the storage account connection string from app.config
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageAccountConnectionString"]);
            CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
            CloudFileShare fileShare = fileClient.GetShareReference("sharedfiles");
            if (fileShare.Exists())
            {
                CloudFileDirectory rootDirectory = fileShare.GetRootDirectoryReference();
                if (rootDirectory.Exists())
                {
                    CloudFileDirectory customDirectory = rootDirectory.GetDirectoryReference("Shared Content");
                    if (customDirectory.Exists())
                    {
                        CloudFile file = customDirectory.GetFileReference("The Word.docx");
                        if (file.Exists())
                        {
                            Console.WriteLine("Downloading file..");
                            DownloadFileFromShare(file, @"D:\Downloaded_The_Word.docx");
                        }
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
        }
        finally
        {
            Console.WriteLine("Enter to exit..");
            Console.ReadLine();
        }
    }

    private static void DownloadFileFromShare(CloudFile file, string saveToPath)
    {
        // Download synchronously so the file is fully written before the console application exits
        file.DownloadToFile(saveToPath, System.IO.FileMode.OpenOrCreate);
    }
}
Setting SAS policies on content in file share
As mentioned above in the article, we can set up SAS policies using the storage client libraries for better control over the content; let's see how it can be done.
This article assumes that readers have a basic understanding of SAS and related policies, and so it will only focus on setting these policies on file share contents. If you do not know about SAS, you can read more information here.
The basic idea is this: when you don't trust your storage clients but still want to provide access to resources in storage, you can achieve this by handing out a shared access signature token and letting clients access the resource for a limited period. You can optionally define an access policy and generate the token from that policy, which is what we will do in our example below. The benefit you get out of this is that you don't have to share the primary or secondary key of your storage account with the end users, which would ultimately give them administrative access to your storage account.
In our sample code below, we will define a SAS policy with read-only permissions on the file share which we created. We will then try to perform a write operation on the file share, i.e. creating a new file in it; it is expected that the code will fail with an error, as we don't have any write permissions yet.
If we run the code below as-is, then we should get an error like this:
using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageAccountConnectionString"]);
            CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
            CloudFileShare fileShare = fileClient.GetShareReference("sharedfiles");
            if (fileShare.Exists())
            {
                string policyName = "DemoPolicy" + new Random().Next(50);
                FileSharePermissions fileSharePermissions = fileShare.GetPermissions();

                // Define a read-only stored access policy that is valid for one hour
                SharedAccessFilePolicy sharedAccessFilePolicy = new SharedAccessFilePolicy()
                {
                    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
                    Permissions = SharedAccessFilePermissions.Read
                    //Permissions = SharedAccessFilePermissions.Write
                };
                fileSharePermissions.SharedAccessPolicies.Add(policyName, sharedAccessFilePolicy);

                // Set the permissions on the file share
                fileShare.SetPermissions(fileSharePermissions);

                // Generate a SAS token based on the policy and use it to create a new file
                CloudFileDirectory rootDirectory = fileShare.GetRootDirectoryReference();
                if (rootDirectory.Exists())
                {
                    CloudFileDirectory customDirectory = rootDirectory.GetDirectoryReference("Shared Content");
                    if (customDirectory.Exists())
                    {
                        CloudFile file = customDirectory.GetFileReference("DemoFile.txt");
                        string sasToken = file.GetSharedAccessSignature(null, policyName);

                        // Build the URL of the file with the SAS token appended
                        Uri fileSASUrl = new Uri(file.StorageUri.PrimaryUri.ToString() + sasToken);
                        CloudFile newFile = new CloudFile(fileSASUrl);
                        newFile.UploadText("Hello!");
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Error: " + ex.Message);
        }
        finally
        {
            Console.WriteLine("Enter to exit..");
            Console.ReadLine();
        }
    }
}
Now we will simply change the policy so that it also grants write permission, and run the code again.
Note that the change is only in a single line, i.e.
// define policy
SharedAccessFilePolicy sharedAccessFilePolicy = new SharedAccessFilePolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
Permissions = SharedAccessFilePermissions.Read | SharedAccessFilePermissions.Write
};
The code runs successfully; let's verify that the file got created in the file share.
If you download the file and look at its content, you should be able to see the "Hello!" which we wrote using our console application.
There are lots of other possibilities to configure and play around with the file share using the storage client assemblies and PowerShell scripts.
Thanks for reading; your views and comments will be appreciated.