How to use Azure DevOps Pipeline and Cake to generate a static website

I have that little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that to Azure! Using an Azure DevOps pipeline, I could do all of that automatically at every "git push"... meaning I could write a new blog post from anywhere, without having anything installed on my machine, and still be able to update my blog.

In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.

The Goal


The idea here is to have, on a local machine, a folder tracked by Git. At every push, we want that change to trigger our CI/CD process. The Build Pipeline will generate the static website. The Release Pipeline will create our Azure resources and publish those artifacts.

The Static Website


In this post, I'm using Wyam.io as the static website generator. However, it doesn't matter; there are a ton of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .NET and, if I eventually want to dig deeper, it will be easier for me.

All those generators follow the same pattern. You have an input folder containing all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it is a good practice to modify the .gitignore file accordingly. As an example, here is what mine looks like:

output/

tool/
tools/

wwwroot/

config.wyam.dll
config.wyam.hash
config.wyam.packages.xml

Build Pipeline


The build pipeline will generate our website for us, so it needs to have the generator installed. A great tool for this kind of task is Cake. What I like about that tool is that it is cross-platform, so I can use it without worrying about which OS it will run on.

The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To learn how to do that, navigate to the Get Started page of the Cake website; it explains that we need to execute build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the build.cake file. Those files can be found on the GitHub repository mentioned on the website.

On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:

#tool nuget:?package=Wyam&version=2.2.0
#addin nuget:?package=Cake.Wyam&version=2.1.3

var target = Argument("target", "Build");

Task("Build")
    .Does(() =>
    {
        Wyam( new WyamSettings {
            Recipe = "Blog",
            Theme = "CleanBlog"
        });        
    });

Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Recipe = "Blog",
            Theme = "CleanBlog",
            Preview = true,
            Watch = true
        });        
    });

RunTarget(target);

The first line installs the required NuGet package (you should definitely specify the version). Then the script defines some tasks and runs the generation command. Create that file at the root of the website folder.
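
If you want the generation to start from a clean state, you could also add an optional task that wipes the output folder before building. This is only a sketch, using Cake's built-in CleanDirectory alias; it is not required by Wyam:

Task("Clean")
    .Does(() =>
    {
        // Remove any previously generated site so the next build starts fresh.
        CleanDirectory("./output");
    });

You would then add .IsDependentOn("Clean") to the Build task so the cleanup runs first.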

Now let's have a look at the azure-pipeline.yml file.

trigger:
- master

variables:
  DOTNET_SDK_VERSION: '2.1.401'

pool:
  vmImage: 'VS2017-Win2016'

steps:
- task: DotNetCoreInstaller@0
  displayName: 'Use .NET Core SDK $(DOTNET_SDK_VERSION)'
  inputs:
    version: '$(DOTNET_SDK_VERSION)'

- powershell: ./build.ps1
  displayName: 'Execute Cake PowerShell Bootstrapper'

- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    Contents: '**\*'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/output'
    CleanTargetFolder: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'

The first line specifies the pipeline trigger; in our case, we watch the master branch. Then I declare a variable to hold the .NET Core SDK version. That way, it will be easier to maintain the script in the future.

The pool section specifies what kind of agent is created. Here I'm using a Windows one, but I could have used Linux too (all components are cross-platform).

Then comes the list of steps. The first one installs .NET Core. The second step is a PowerShell command that executes our build.ps1 file. At this stage, the static website should be generated in a subfolder named output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way, the Release Pipeline can access the artifacts.

There is detailed information about all the commands available in a YAML Azure Pipeline file in the documentation. Create your own, or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any other repository.

Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. From the left menu, click on Pipelines (the rocket icon) to create a new pipeline. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.

To configure the pipeline, since we have already created the azure-pipeline.yml file, select the Existing Azure Pipelines YAML file option and point it to our file in the deployment folder.



It will open our YAML file; if you wish, you can update it. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now every time you push changes to your repository, that build will get triggered and generate the static website.

(Next post in the series - Deploy automatically a static website into an Azure Blob storage with Azure DevOps Pipeline)


In a video, please!

I also have a video of this post if you prefer.







Reading Notes #370

Cloud


Programming


Databases


Miscellaneous



Reading Notes #369

Cloud


Programming


Miscellaneous


~

Reading Notes #368


Cloud


Programming


Miscellaneous


Books

Tribes: We Need You to Lead Us 

Author: Seth Godin 

A book that will polarize you. In my case, I really enjoyed it until the last page. I felt motivated, and it was nice.

How to Unzip Automatically your Files with Azure Function v2

I published a video that explains how to unzip files without any code by using Logic Apps. However, that solution didn't work for bigger files or different archive types. This post explains how you can use an Azure Function to cover those situations. This first iteration supports "zip" files; all the code is available in my GitHub.

Prerequisites


To create the Azure Function, I will use the excellent Azure Functions extension for Visual Studio Code. You don't "need" it; however, it makes things very easy.


You can easily install the extension from inside Visual Studio Code by clicking on the extension button in the left menu. You will also need to install the Azure Functions Core Tools.

Creating the Function


Once the extension is installed, you will find a new button in the left menu. It opens a new section with four new options: Create New Project, Create Function, Deploy to Function App, and Refresh.


Click on the first option, Create New Project. Select a local folder and a language; for this demo, I will use C#. This will create a few files and folders. Now let's create our function. From the extension menu, select the second option, Create Function. Create a Blob Trigger named UnzipThis in the folder we just created, and select (or create) a Resource Group, Storage Account, and location in your subscription. After a few seconds, another question will pop up asking for the name of the container that our blob trigger monitors. For this demo, input-files is used.

Once the function is created, you will see this warning message.


What that means is that, to be able to debug locally, we will need to set the AzureWebJobsStorage setting to UseDevelopmentStorage=true in the local.settings.json file. It will look like this:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "unziptools_STORAGE": "DefaultEndpointsProtocol=https;AccountName=unziptools;AccountKey=XXXXXXXXX;EndpointSuffix=core.windows.net",
    }
}

Open the file UnzipThis.cs; this is our function. On the first line of the function, you can see that the Blob trigger is defined.

[BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]Stream myBlob

The binding is attached to the container named input-files, from the storage account reachable by the connection "cloud5mins_storage". The real connection string is in the local.settings.json file.

Now, let's put the code we need for our demo:

[FunctionName("Unzipthis")]
public static async Task Run([BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")]CloudBlockBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name}");

    string destinationStorage = Environment.GetEnvironmentVariable("destinationStorage");
    string destinationContainer = Environment.GetEnvironmentVariable("destinationContainer");

    try{
        if(name.Split('.').Last().ToLower() == "zip"){

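            // Connect to the destination storage account and get a reference to the target container.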
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);
            
            using(MemoryStream blobMemStream = new MemoryStream()){

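                // Download the zip blob into memory so it can be opened as an archive.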
                await myBlob.DownloadToStreamAsync(blobMemStream);

                using(ZipArchive archive = new ZipArchive(blobMemStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                    {
                        log.LogInformation($"Now processing {entry.FullName}");

                        // Replace anything that is not a digit, letter, or "-" with a "-";
                        // Azure Storage is strict about which characters are valid in a blob name.
                        string validName = Regex.Replace(entry.Name, @"[^a-zA-Z0-9\-]", "-").ToLower();

                        CloudBlockBlob blockBlob = container.GetBlockBlobReference(validName);
                        using (var fileStream = entry.Open())
                        {
                            await blockBlob.UploadFromStreamAsync(fileStream);
                        }
                    }
                }
            }
        }
    }
    catch(Exception ex){
        log.LogInformation($"Error! Something went wrong: {ex.Message}");

    }            
}

UPDATED: Thanks to Stefano Tedeschi who found a bug and suggested a fix.

The source of our compressed file is defined in the trigger. To define the destination, destinationStorage and destinationContainer are used; their values are saved in local.settings.json. Then, because this function only supports .zip files, a little validation was required.

Next, we create an archive instance using the System.IO.Compression library. We then create references to the destination storage account, container, and blobs. It is not possible to use a second output binding here, because a single archive contains a variable number of files to extract. The bindings are static; therefore, we need to use the regular storage API.
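
For comparison, here is a minimal, hypothetical sketch of what a static output binding looks like; the output-files container name is made up, and the point is that it can only write one pre-declared blob per execution, which is exactly why it cannot fan out to the unknown number of files inside an archive:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class CopySingleFile
{
    // Sketch only: one input blob, one output blob, both declared at compile time.
    [FunctionName("CopySingleFile")]
    public static async Task Run(
        [BlobTrigger("input-files/{name}", Connection = "cloud5mins_storage")] Stream input,
        [Blob("output-files/{name}", FileAccess.Write, Connection = "cloud5mins_storage")] Stream output,
        string name,
        ILogger log)
    {
        log.LogInformation($"Copying {name} to the output container.");
        await input.CopyToAsync(output);
    }
}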

Then, for every file (aka entry) in the archive, the code uploads it to the destination storage.

Deploying


To deploy the function, from the Azure Function extension click on the third option: Deploy to Function App. Select your subscription and Function App name.

Now we need to configure our settings in Azure. By default, the local settings are NOT used. Once again, the extension is handy.


Under your subscription, expand the freshly deployed Function App AzUnzipEverything, and right-click on Application Settings. Use Add New Setting to create cloud5mins_storage, destinationStorage, and destinationContainer.

The function is now deployed and the settings are set; now we only need to create the blob storage containers, and we will be able to test the function. You can easily do that directly from the Azure portal (portal.azure.com).
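
If you prefer to do it from code, here is a minimal sketch using the same WindowsAzure.Storage package already referenced by the function; the output-files name is only an example and must match whatever you put in the destinationContainer setting:

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;

public static class CreateContainers
{
    public static async Task EnsureContainersAsync(string connectionString)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        var blobClient = account.CreateCloudBlobClient();

        // Container monitored by the blob trigger.
        await blobClient.GetContainerReference("input-files").CreateIfNotExistsAsync();

        // Destination container; must match the destinationContainer application setting.
        await blobClient.GetContainerReference("output-files").CreateIfNotExistsAsync();
    }
}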

You are now ready to upload a file into the input-files container.

Let's Code Together


This first iteration only supports "zip" files. All the code is available on GitHub; feel free to use it. If you would like to see or add support for other archive types, join me on GitHub!

In a video, please!


I also have a video of this post if you prefer.



I also have an extended version where I introduce the Visual Studio Code extension for Azure Functions in more detail and explain more about Azure Functions V2.

Reading Notes #367

Suggestion of the week


Cloud


Programming


Miscellaneous


My journey as a technical YouTuber


Sixty-five videos and one year later
It's already been one year since I started sharing videos on YouTube. I've been blogging for many years and wanted to try something new, and creating videos seemed like the next logical step. That is how I decided to start sharing short videos answering technical questions about Microsoft Azure on YouTube. In this post, I will explain what I learned during the first year of my journey.

The Beginning

I started my YouTube channel in French. During the first two months, I published one video per week. I learned a lot through this period: how to prepare my code snippets, my files, and my screen. The biggest takeaway was that preparation is the key. The more prepared you are, the more efficient you will be in front of the camera. I also discovered that, compared to writing a blog post, you need to pay attention to the tiny details much earlier in the process.

It may sound like a cliché, but it was challenging to watch myself. But it's the best way to improve. You need to accept those mistakes and promise yourself that next time, you will do better. It's normal and okay not to be perfect.

Let's be bilingual

One thing I was looking forward to was being able to share my videos with all my clients, friends, and community members. However, I couldn't, because while some are perfectly comfortable in French (and only French), others only understand English. That is why I decided to record all my videos in both French and English. I knew that would mean doubling the workload, so I adapted my schedule to publish every second week.

Doing all those videos, I found my beat, my style, the things I like and dislike. I also spent a lot of time learning the YouTube platform through books and by watching other YouTubers. It's such an incredibly resourceful community!

A New Camera

The software I was using, and that I'm still using today (Camtasia from TechSmith), can record both your screen and your webcam. It keeps the tracks separated, which is great because you can change the size and position of the webcam inset in post-production.

My challenge was that I wanted to record both in 1080p (full HD), because during my intros and conclusions, when the webcam input fills the screen, I want good image quality. However, the software (or more likely my PC) did not let me record both the screen and the webcam input in 1080p.
To upgrade the quality of my recordings, I decided to use my DSLR instead of my webcam. Of course, it means more work in post-production because I now need to synchronize the two video sources, but now everything is in 1080p.

The famous algorithm

The more I studied my statistics and read comments (thank you all for your constructive comments, by the way), the more I understood that having all my videos in French and English on the same channel was not the best idea.

First, it was very confusing for the YouTube algorithm because it couldn't identify the videos' language; therefore, it was harder for it to do its job and make recommendations.

Secondly, it didn't cross my mind at the beginning, because I understand both languages, but for a subscriber who understands only one of the languages, it was very confusing and not very pleasant to have all those "other", unusable videos in the feed.

After a very long (and hard) reflection, I decided to create a second channel and "move" all my French content to that channel. I say "move", but you cannot move content on YouTube, and that was the principal reason why I was hesitating to split my channel. By creating a new channel and re-uploading my older (aka French) videos, I was losing all my stats (views, subscribers) and references.

The effect was immediate on the English channel, as the views and subscribers exploded a week later. On the other side, my brand new French channel didn't really catch up. Nevertheless, I intend to continue creating French content. ;)

Today's Status

As this post gets published, I have a cumulative total of sixty-five videos online and more than a thousand subscribers. I publish every second week, usually on Thursday: two videos on the same topic, one in French on fr.cloud5mins.com and one in English on cloud5mins.com. I have a fantastic, super supportive community, asking questions and suggesting topics.



I'm very thankful for all that positive feedback. To help me, you can subscribe to my YouTube channel, of course, or become one of my patrons at patreon.com/fboucheros.

What's Next

I'm definitely continuing to create videos. I also plan to start streaming on Twitch, where I will build cloud solutions.

See you soon,


Reading Notes #366

Cloud


Programming


Databases


Miscellaneous



Reading Notes #365

Cloud


Miscellaneous


Books


I Hope I Screw This Up: How Falling In Love with Your Fears Can Change the World (Kyle Cease)

I was definitely not expecting that when I picked up this book, but I am happy I did.

This "self-help" book is filled with a ton of comedy, and I appreciated it. I felt like my best friend was talking about a serious topic, but because he was in a crazy good mood, he was just having a good time telling me his story. Simple and real. It leaves you with a lot to think about.





Reading Notes #364

Cloud




    Programming



    Miscellaneous




    ~

      Reading Notes #363


      Cloud




      Programming




      Databases




      Miscellaneous


      ~


      Reading Notes #362



      Suggestion of the week



      Cloud



      Programming



      Miscellaneous



      ~

      Reading Notes #361

      Cloud



      Programming



      Miscellaneous



      Books



      TED Talks: The Official TED Guide to Public Speaking

      Author: Chris J. Anderson

      Fantastic book that covers a lot of topics related to presenting. It covers the before, during, and even the after very smartly. There are no recipes here and this is exactly what makes this book so great. A must.











      ~


      Reading Notes #360


      Cloud



      Programming


      Miscellaneous


      ~


      Reading Notes #359

      Cloud


      Programming


      Databases


      Miscellaneous


      Books

      How to Be a Bawse: A Guide to Conquering Life
      Lilly Singh
      Not only is the message strong, but the way she delivers it is awesome. Many times I laughed and nodded my head... Definitely a great book to read at the end of the year, when resolution time is not far away...









      ~

      Reading Notes #358

      Cloud


      Programming


      Miscellaneous


      ~

      Reading Notes #357

      Suggestion of the week


      Cloud


      Programming


      Miscellaneous



      ~


      Reading Notes #356

      Suggestion of the week

      • Security Headers (Tanya Janca) - Interesting post that shows the code/configuration we need to add, in order to get a more secure website.

      Cloud


      Programming


      Miscellaneous


      Books

      Fast Focus: A Quick-Start Guide To Mastering Your Attention, Ignoring Distractions, And Getting More Done In Less Time! (Damon Zahariades) - Great book, well organized, simple, and straight to the point. It is divided into three parts: understanding focus, creating an environment for focus, and employing tactics to focus. It lists the top 10 obstacles to staying focused and gives you a great idea of how to start your journey.











      ~


      Reading Notes #354

      Cloud


      Programming


      Books

      Extreme Ownership: How U.S. Navy SEALs Lead and Win (Jocko Willink, Leif Babin) - Very interesting book. Yes, it contains a lot of battle details, and at first I was not sure, but then things fell into place once you understand what each story was demonstrating. It also contains more business-focused examples. Everything is very clear and well explained in plain English.









      ~

      Reading Notes #353

      Suggestion of the week



      Cloud



      Programming



      Books

      The First 20 Hours: How to Learn Anything... Fast (Josh Kaufman) - That was a very interesting book. I devoured it much faster than I thought! As I was reading it, I was thinking... hey, as a developer/programmer, I already do a lot of those things... Things change so fast, technology changes... And a few pages later, the author was saying the same thing. :) It looks like we can adapt to many different situations. The author shares with us a few journeys he went through while learning new things. While I may not be interested in learning how to play Go, I found every part of the book very interesting, as those journeys are packed with tons of information.
      ISBN: 1591845556 (ISBN13: 9781591845553)


      Miscellaneous


      ~

      Reading Notes #352

      Suggestion of the week


      Cloud


      Programming


      Data


      Miscellaneous


      Books

      • The Personal MBA: Master the Art of Business (Josh Kaufman) - An interesting book that helps you get in the mood, get prepared, and, for those who weren't sure yet about the idea of a personal MBA (compared to a regular one), helps you start planning and get moving. This book isn't a one-book miracle MBA certification, but more likely a really good way to understand the journey the reader is about to start. The complete list of books for this adventure is constantly updated and is available online.


      ~

      How to create an Azure Container Instance (ACI) with .Net Core

      For a project I just started, I need to create Azure resources from code. In fact, I want to create an Azure Container Instance. I already know how to create a container from Logic Apps and Azure CLI/PowerShell, but I was looking to create it inside an Azure Function. After a quick search online, I found the Azure Management Libraries for .NET (aka the Fluent API), a project available on GitHub that does just that (and so much more)!
      In this post, I will share with you how this library works and the result of my test.

      The Goal


      For this demo, I will create a .NET Core console application that creates an Azure Container Instance (ACI). Afterward, it should be easy to take this code and migrate it to an Azure Function or anywhere else.


      The Console Application


      Let's create a simple console application with the following commands:

      dotnet new console -o AzFluentDemo
      cd AzFluentDemo
      dotnet add package microsoft.azure.management.fluent

      The last command will take the NuGet package available online and add it to our solution. Now we need a service principal so our application can access the Azure subscription. A simple way to create one is to use the Azure CLI:

      az ad sp create-for-rbac --sdk-auth > my.azureauth

      This will create an Active Directory (AD) Service Principal (SP) and write the content into the file my.azureauth. Perfect. Now open the solution; for this kind of project, I like to use Visual Studio Code, so code . will do the work for me. Replace the content of the Program.cs file with the following code.

      using System;
      using Microsoft.Azure.Management.Fluent;
      using Microsoft.Azure.Management.ResourceManager.Fluent;
      using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
      namespace AzFluentDemo
      {
          class Program
          {
              static void Main(string[] args)
              {
                  string authFilePath = "/home/frank/Dev/AzFluentDemo/my.azureauth";
                  string resourceGroupName  = "cloud5mins";
                  string containerGroupName = "frank-containers";
                  string containerImage  = "microsoft/aci-helloworld";
                  // Set Context
                  IAzure azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
                  ISubscription sub;
                  sub = azure.GetCurrentSubscription();
                  Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}' (ID: {sub.SubscriptionId})");
                  // Create ResourceGroup
                  azure.ResourceGroups.Define(resourceGroupName)
                      .WithRegion(Region.USEast)
                      .Create();
                  // Create Container instance
                  IResourceGroup resGroup = azure.ResourceGroups.GetByName(resourceGroupName);
                  Region azureRegion = resGroup.Region;
                  // Create the container group
                  var containerGroup = azure.ContainerGroups.Define(containerGroupName)
                      .WithRegion(azureRegion)
                      .WithExistingResourceGroup(resourceGroupName)
                      .WithLinux()
                      .WithPublicImageRegistryOnly()
                      .WithoutVolume()
                      .DefineContainerInstance(containerGroupName + "-1")
                          .WithImage(containerImage)
                          .WithExternalTcpPort(80)
                          .WithCpuCoreCount(1.0)
                          .WithMemorySizeInGB(1)
                          .Attach()
                      .WithDnsPrefix(containerGroupName)
                      .Create();
                  Console.WriteLine($"Soon Available at http://{containerGroup.Fqdn}");
              }
          }
      }

      In the first few lines, I declare some variables: the path of the service principal file created earlier, the resource group name, the container group name, and the image I will use; for this demo, aci-helloworld. Then we get access with Azure.Authenticate. Once we have access, it's very easy, and the IntelliSense is fantastic! I don't think I need to explain the rest of the code, as it is already self-explanatory.
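
      Once you are done with the demo, a few lines of the same Fluent API can remove everything that was created. This is only a sketch that reuses the authFilePath and resourceGroupName variables from Program.cs above; deleting the resource group also deletes the container group inside it.

      // Clean up: assumes the same authFilePath and resourceGroupName as in Program.cs.
      IAzure azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
      azure.ResourceGroups.DeleteByName(resourceGroupName);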

      Got an Error?


      While running, you may encounter an error message complaining about the namespace not being registered, or something like that (I'm sorry, I did not note the exact error message). You only need to register it with the command:

      az provider register --namespace Microsoft.ContainerInstance

      It will take a few minutes. To see if it's done you can execute this command:

      az provider show -n Microsoft.ContainerInstance --query "registrationState" 

      Wrap it up


      And voila! If you do a dotnet run, after a minute or two, you will have a new web application running inside a container, available from http://frank-containers.eastus.azurecontainer.io. It's now very easy to take that code and bring it to an Azure Function or any .NET Core application that runs anywhere (Linux, Windows, macOS, web, containers, etc.)!
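
      To give an idea of what that migration could look like, here is a minimal, hypothetical sketch of the same Fluent calls wrapped in an HTTP-triggered Azure Function. The function name CreateAci, the MY_AZURE_AUTH application setting, and the hard-coded names are assumptions for illustration only, and error handling is left out:

      using System;
      using System.Threading.Tasks;
      using Microsoft.AspNetCore.Http;
      using Microsoft.AspNetCore.Mvc;
      using Microsoft.Azure.Management.Fluent;
      using Microsoft.Azure.Management.ResourceManager.Fluent.Core;
      using Microsoft.Azure.WebJobs;
      using Microsoft.Azure.WebJobs.Extensions.Http;
      using Microsoft.Extensions.Logging;

      public static class CreateAciFunction
      {
          [FunctionName("CreateAci")]
          public static async Task<IActionResult> Run(
              [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
              ILogger log)
          {
              // Path to the service principal file, stored in an app setting (assumption).
              string authFilePath = Environment.GetEnvironmentVariable("MY_AZURE_AUTH");

              IAzure azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();

              // Same container group definition as in the console application.
              var containerGroup = await azure.ContainerGroups.Define("frank-containers")
                  .WithRegion(Region.USEast)
                  .WithExistingResourceGroup("cloud5mins")
                  .WithLinux()
                  .WithPublicImageRegistryOnly()
                  .WithoutVolume()
                  .DefineContainerInstance("frank-containers-1")
                      .WithImage("microsoft/aci-helloworld")
                      .WithExternalTcpPort(80)
                      .WithCpuCoreCount(1.0)
                      .WithMemorySizeInGB(1)
                      .Attach()
                  .WithDnsPrefix("frank-containers")
                  .CreateAsync();

              log.LogInformation($"Container group created: {containerGroup.Fqdn}");
              return new OkObjectResult($"http://{containerGroup.Fqdn}");
          }
      }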


      In a video, please!


      I also have a video of this post if you prefer.







      ~