Data is the key to almost all solutions. Obviously, at some point, we will need to move it. And this is when AzCopy comes to the rescue. In this short post/video I will share how you can securely copy a Zip file (aka data) from one location (blob storage, AWS) to a blob storage in an Azure subscription (the same subscription or a different one).
What is AzCopy
AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account. It can run on Windows, Mac, and Linux. And... it's already pre-installed inside Cloud Shell!
How it works
AzCopy can do many things, but let's focus on the "copy" feature. Here is the command to execute to copy a file from one location to another.
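As a generic sketch (the angle-bracket values are placeholders; the version pre-installed in Cloud Shell uses this newer azcopy copy syntax):
azcopy copy "<source URL or local path>" "<destination URL>"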
It looks simple, right? And it is. To keep things secure, AzCopy can use Shared Access Signature (SAS) tokens. To get those in Azure, you can execute a command (ex: az storage container generate-sas) or use the Azure Portal.
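For example, a container-level SAS could be generated with something along these lines (the account name, container name, permissions, and expiry date are placeholders to adapt to your scenario):
az storage container generate-sas --account-name mystorageaccount --name mycontainer --permissions rl --expiry 2025-12-31 --output tsv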
Once you are in the Azure portal, open the storage account of your source or destination (you will have to do both). From the options on the left, search for Shared access signature (or just "sas") and click on it. Select the type of options you need. A best practice is to allow only the minimum requirements: if you know you are only moving files, then uncheck File, Queue, and Table. The same goes for the resource types, permissions, and expiry date/time. Once you are done, click the Generate SAS and connection string button.
Use those URLs with the SAS token in your command, and voila!
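Putting it all together, the copy command ends up looking something like this (the account names, container, file name, and SAS tokens below are made-up placeholders):
azcopy copy "https://sourceaccount.blob.core.windows.net/data/archive.zip?<source-SAS>" "https://destinationaccount.blob.core.windows.net/data/archive.zip?<destination-SAS>"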
7 best practices for building containers (Google) - This post helps us improve our skills with containers. It also shares a lot of excellent, more detailed references.
How to Lead When You're Feeling Afraid (Peter Bregman) - A really interesting post that will help us, for sure, because we all live through this kind of situation one day or another.
Defeating Electron (Felix Rieseberg) - Chances are pretty good that you are (just like me) using an application built on top of Electron. This post explains a little of what's happening under the hood.
I have been waiting for this feature for so long! I know; it's not a major feature, but it fills an important gap in the Azure offer. We can now create static websites in Azure Blob Storage (as I'm writing this post, the service is still in preview). In this post, I will explain why I think it's really good news and show how to create and publish a static website.
Why It's Awesome News
The cloud is the perfect place when you need to build something big very quickly. It's also an excellent solution when the number of resources you require varies a lot. Because Azure is a service, it will provide you with as many resources as you would like in a few minutes. And when you are done with the resources, you stop paying for them; it's really great like that!
However, if the only thing you needed was to host a little something like a blog, a little website for an event, or some temporary publicity, Azure was not the best place for it. I mean, yes, of course, you could build a service and host many little websites on it (Scott Hanselman has excellent posts about that, like this one), but it always felt a bit overkill for most users. Some people kept an "old style" hosting provider just for that. I mean, it's fine, it works... But with Azure Storage, it will be really reliable, and at a lower cost! Let's see how we can create one.
Create a Static Website
To get the static website feature, you need to create an Azure Blob Storage account the same way you created them before; however, it needs to be of kind General Purpose V2 (GPV2). Today, if you install the Azure CLI Storage-extension Preview, you can use it to create one, or you can simply go to portal.azure.com. Let's use the portal since it's more visual.
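If you prefer the command line, the steps look roughly like this with the preview extension (frankdemo, the resource group, and the location are placeholders, and the exact extension commands may have changed since the preview):
az extension add --name storage-preview
az storage account create --name frankdemo --resource-group frankdemo-rg --location eastus --kind StorageV2 --sku Standard_LRS
az storage blob service-properties update --account-name frankdemo --static-website --index-document index.html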
Once the storage account is created, open it. In the left menu of the storage blade, click on the Static website (preview) option. That will open the configuration page for our static website. First, click the Enabled button, then enter the index document name (ex: index.html). Finally, click the Save button at the top of the blade.
The shell for our website is now created. A new Azure Blob Storage container named $web has been created. The primary and secondary endpoints should now be displayed (ex: https://frankdemo.z13.web.core.windows.net/). If you test this URL, you will see a message saying that the content doesn't exist... and that's normal.
Create some content
This is the part where it all depends on your needs. You may already have some HTML pages ready, you may want to code them all yourself, or the website may already exist. For this post, I will create a brand-new blog using a static site generator named Wyam (if you would like to see how to do it with Jekyll, another generator, I use it in the video).
To create a new template with Wyam, you use the following command in a command prompt. That will create a new website in the output subfolder.
wyam --recipe Blog --theme CleanBlog
Publish to Azure
It's now time to upload our content to the Azure Blob Storage. The easiest way is probably directly from the portal. To upload a file, click on the $web container, then the Upload button. From the new form, select the file and upload it.
The main problem with this method is that it only works one file at a time... and a website usually has many of those...
A more efficient way would be to use Azure Storage Explorer or some script. Azure Storage Explorer doesn't support the Azure Storage static website feature yet, but it will soon. So that leads us to scripts or command lines.
AzCopy
I really like AzCopy as it's very efficient and easy to use. Unfortunately, as I'm writing this post, AzCopy doesn't support the Azure Storage static website. I tried to upload all the content from the output folder (and subfolders) with a command like the one below, but it fails.
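For reference, the attempt looked something like this (frankdemo and the key are placeholders; the failure comes from AzCopy not recognizing the $web container at the time):
AzCopy /Source:.\output /Dest:https://frankdemo.blob.core.windows.net/$web /DestKey:<storage-key> /S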
An Azure CLI extension preview is also available. As I mentioned previously, the extension gives you the possibility to create a static website or update its configuration. To upload files, you have two options: the batch option is more efficient, of course, but the file-by-file option also works. Thanks to Carl-Hugo (@CarlHugoM) for your help with those commands.
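With the extension installed, the two upload options look roughly like this (frankdemo is a placeholder, and $web is quoted so the shell doesn't treat it as a variable):
az storage blob upload-batch --source ./output --destination '$web' --account-name frankdemo
az storage blob upload --file ./output/index.html --container-name '$web' --name index.html --account-name frankdemo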
I finally tried the Visual Studio Code Storage extension. After installing it, you need to add a User Setting with Ctrl + ,. Then add "azureStorage.preview.staticWebsites" : true to your configuration. Now you just need to click on the extension, select the Azure Blob Storage from your subscription, and right-click to be able to upload a folder.
Depending on how many files there are and their sizes, it will take a moment. VS Code will notify you when it's done. You will then be able to go back online and refresh your website to see the result.
Conclusion
I'm very happy to see this feature because it fills a need that was not really covered yet by the Microsoft offer. Right now, it's an early preview, so even if the service is very stable, not all the tools support it, but that's only temporary. Right now, you can set your custom domain name; however, HTTPS is not supported.
So what do we do with it? Should we wait or jump right on it? Well, as the best practices imply, when a feature is in preview, don't put your core business on it yet. If you are just looking to build a personal website or a little promo, then... enjoy!
Why Developers Should Install WSL Today (Matt Hyon, August Banks) - I love WSL (aka Bash on Windows). It has evolved and become something much more. After reading this post, you will either install it right away or smile because it's already installed.
Why Responsive Web Design? (Chris Love) - A nice post that explains clearly why you really need to think about responsive design, even more so in 2018.
The RULES of Blogging (Darren Rowse) - If you are already blogging, or thinking about starting, take two minutes and watch this short video... The rules are... simple.
Copy, Download or Upload from-to any combination of Windows, Linux, OS X, or the cloud
Data is and will always be our primary concern. Whether shaped as text files, images, VM VHDs, or in any other way, at some point in time, our data will need to be moved. I already wrote about this previously, and the content of that post is still valuable today, but I wanted to share new options and cover all the ground (meaning Linux, Windows, and OS X).
Scenarios
Here are a few scenarios where you would want to move data.
Your Microsoft Azure trial is ending, and you wish to keep all the data.
You are creating a new web application, and all those images need to be moved to the Azure subscription.
You have a Virtual Machine that you would like to move to the cloud or to a different subscription.
...
AzCopy
AzCopy is a fantastic command-line tool for copying data to and from Microsoft Azure Blob, File, and Table storage. For a long time, AzCopy was only available for Windows users; however, a second version built on the .NET Core Framework, which also runs on Linux and macOS, was recently released. The commands of the two versions are very similar, but not exactly the same. Another solution for Mac and Linux users, Azure CLI, will also be introduced later in this post.
AzCopy on Windows
In its simplest expression, an AzCopy command takes a source and a destination, plus a few options.
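A sketch of the general shape (the values in angle brackets are placeholders; the source and destination can each be a local path or a storage URL):
AzCopy /Source:<source> /Dest:<destination> [/SourceKey:<key>] [/DestKey:<key>] [options]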
If you have previously installed an Azure SDK on your machine, you already have it. By default, AzCopy is installed in %ProgramFiles(x86)%\Microsoft SDKs\Azure\AzCopy (64-bit Windows) or %ProgramFiles%\Microsoft SDKs\Azure\AzCopy (32-bit Windows).
If you only need AzCopy on a server, you can download the latest version of AzCopy.
Let's see some frequent usage. First, let's say you need to move all those images from your server to an Azure Blob storage.
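For example, the first command below uploads all the .jpg files from a local folder to the blogimages container, and the second copies the content of that container to a second storage account (the local path and the keys are placeholders; /S makes the copy recursive):
AzCopy /Source:C:\images /Dest:https://frankysnotes.blob.core.windows.net/blogimages /DestKey:<destination-key> /Pattern:*.jpg /S
AzCopy /Source:https://frankysnotes.blob.core.windows.net/blogimages /Dest:https://frankshare.blob.core.windows.net/imagesbackup /SourceKey:<source-key> /DestKey:<destination-key> /S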
These examples were simple, but AzCopy is a very powerful tool. I invite you to type one of the following commands to discover more about using AzCopy:
For detailed command-line help for AzCopy: AzCopy /?
For command-line examples: AzCopy /?:Samples
AzCopy on Linux
Before you can install AzCopy, you will need to install .NET Core. This is done very simply with a few commands.
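As a rough sketch, on Ubuntu it looked something like this at the time (the package feed, the SDK version, and the AzCopy download URL have most likely changed since, so treat this as an illustration and check the current documentation):
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-xenial-prod xenial main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-get update
sudo apt-get install dotnet-sdk-2.0.0
wget -O azcopy.tar.gz https://aka.ms/downloadazcopylinux64
tar -xf azcopy.tar.gz
sudo ./install.sh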
It is very similar to the original version, but parameters use -- and - instead of /, and where a : was required, it's now a simple space.
Uploading to Azure
Here is an example of copying a single file, GlobalDevopsBootcamp.jpg, to an Azure Blob Storage. We pass the full local path of the file to --source, the destination is the full URI, and finally we pass the destination blob storage key. Of course, you could also use a SAS token if you prefer.
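Something like this (the local path is a placeholder, and the key is the access key of the destination storage account):
azcopy --source /home/frank/demo/GlobalDevopsBootcamp.jpg --destination https://frankysnotes.blob.core.windows.net/blogimages/GlobalDevopsBootcamp.jpg --dest-key <destination-storage-key>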
To copy the image to a second Azure subscription, we use a similar command: the source is now an Azure Storage URI, and we pass both the source and the destination keys.
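A sketch of that copy (the keys are placeholders; frankshare and imagesbackup are the destination account and container used later in this post):
azcopy --source https://frankysnotes.blob.core.windows.net/blogimages/GlobalDevopsBootcamp.jpg --destination https://frankshare.blob.core.windows.net/imagesbackup/GlobalDevopsBootcamp.jpg --source-key <source-storage-key> --dest-key <destination-storage-key>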
Azure CLI is a set of cross-platform commands for the Azure platform. It gives you tools to manipulate all Azure components, but this post will focus on the Azure Storage features.
There are two versions of the Azure Command-Line Interface (CLI) currently available:
Azure CLI 2.0: written in Python, compatible only with the Resource Manager deployment model.
Azure CLI 1.0: written in Node.js, compatible with both the classic and Resource Manager deployment models.
Azure CLI 1.0 is deprecated and should only be used for support with the Azure Service Management (ASM) model with "classic" resources.
Installing Azure CLI
Let's start by installing Azure CLI. Of course, you can download an installer, but since everything is evolving very fast, why not get it from the Node Package Manager (npm)? The installation is the same; you just need to specify the version if you absolutely need Azure CLI 1.0.
sudo npm install azure-cli -g
To keep with the previous scenario, let's try to copy all the images to a blob storage. Unfortunately, Azure CLI doesn't offer the same flexibility as AzCopy, and you must upload the files one by one. However, to upload all the images from a folder, we can easily put the command in a loop.
# upload every .jpg from the local folder into the blogimages container
for f in Documents/images/*.jpg
do
    azure storage blob upload -a frankysnotes -k YoMjXMDe+694FGgOaN0oaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQWvGrWnWj5m4X1U0HIPUIAA== "$f" blogimages
done
In the previous command, -a was the account name, and -k was the access key. These two pieces of information can easily be found in the Azure portal. From the portal (https://portal.azure.com), select the storage account, then click on Access keys in the menu.
Copying a file (ex: a VM disk, aka a VHD) from one storage account to another one in a different subscription or region is really easy. This time we will use the command azure storage blob copy start, and the -a and -k parameters refer to our destination.
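The command would look roughly like this, where the source URI carries a SAS token and -a/-k identify the destination account (the SAS token and the key are placeholders):
azure storage blob copy start "https://frankysnotes.blob.core.windows.net/blogimages/20151011_151451.MOV?<source-sas>" imagesbackup -a frankshare -k <destination-storage-key>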
The nice thing about this command is that it's asynchronous. To see the status of your copy, just execute the command azure storage blob copy show:
azure storage blob copy show -a frankshare -k YoMjXMDe+694FGgOaN0oPaRdOF6s1ktMgkB6pBx2vnAr8AOXm3HTF7tT0NQVxsqhWvGrWnWj5m4X1U0HIPUIAA== imagesbackup 20151011_151451.MOV
Azure CLI 2.0 (Windows, Linux, OS X, Docker, Cloud Shell)
Azure CLI 2.0 is Azure's new command-line experience, optimized for managing and administering Azure resources against Azure Resource Manager. Like the previous version, it works perfectly on Windows, Linux, OS X, and Docker, but also from the Cloud Shell!
Cloud Shell is available right from the Azure Portal, without any plugin.
Uploading to Azure
The commands are the same as in the previous version, except that the command is now named az. Here is an example of uploading a single file into an Azure Blob Storage.
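For example, uploading the same image would look something like this (the account key is a placeholder):
az storage blob upload --file GlobalDevopsBootcamp.jpg --container-name blogimages --name GlobalDevopsBootcamp.jpg --account-name frankysnotes --account-key <storage-key>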
Let's now copy the file to another Azure subscription. A thing to be aware of is that --account-name and --account-key refer to the destination, even if it's not explicitly specified.
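A sketch of that copy, where --account-name and --account-key point to the destination (frankshare) and the source account is identified by its own set of parameters (the keys are placeholders):
az storage blob copy start --source-account-name frankysnotes --source-account-key <source-storage-key> --source-container blogimages --source-blob GlobalDevopsBootcamp.jpg --destination-container imagesbackup --destination-blob GlobalDevopsBootcamp.jpg --account-name frankshare --account-key <destination-storage-key>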
If you prefer, I also have a video version of that post.
One More Thing
Sometimes, we don't need to script things, and a graphic interface is much better. For this kind of situation, the must is the Azure Storage Explorer. It does a lot! Upload, download, and manage blobs, files, queues, tables, and Cosmos DB entities. And it works on Windows, macOS, and Linux!
It's just the beginning
This post was just an introduction to two very powerful tools. I strongly suggest reading the official documentation to learn more. Use the comments to share all your questions and suggestions.
Securing Web Applications - Simple Talk (Vishwas Parameshwarappa) - Security should be our priority in this era of APIs and IoT... An excellent post to get started, covering multiple security flaws and how to fix them.
Tuples In C# (Mahesh Chand) - Tuples are a feature introduced in .NET 4.0 that can simplify our lives a lot. Learn how in this post.