I recently presented a workshop at TOHack to get started with Azure. The goal was to try different Azure services and see how we could augment an existing website using serverless functions and artificial intelligence. (Also available in French)
During this workshop, a website is deployed automatically from GitHub. Then, by adding an Azure Function and using the Vision API of Azure Cognitive Services, the final solution is able to detect whether uploaded pictures are dogs or not and keep our image folder "clean". We call that application: The automatic Not a Dog application.
The step-by-step instructions with the code can be found on GitHub - Not-a-Dog-Workshop. The workshop can be done in about 45-60 minutes.
I also did a video that is available on my YouTube channel:
If you have questions or get blocked, it will be a pleasure to help you.
Install WSL 2 on Windows 10 (Thomas Maurer) - Awesome tutorial. If, like me, you didn't want to wait until the next Windows release or take the time to compile and debug a deployment... this tutorial is for us!
In an Azure Logic Apps project I am working on, I needed to manipulate strings. I could have created APIs or Azure Functions, but the code is very simple and doesn't use any external libraries. In this post, I will show you how to use the new Inline Code action to execute your code snippet directly inside your Logic App.
Quick Context
The Logic App will read a file from my OneDrive (it will also work with Dropbox, Box, etc.). Here is an example of the file:
Nice tutorial that explains how to build, using postman, an efficient API.[cloud.azure.postman.tools]
The goal is to extract tags, contained between the square brackets, from the text.
Logic App: Get File Content
From the Azure Portal, create a new Logic App by clicking the big green "+" button in the top left corner and searching for Logic App.
For this demo, I will use the Interval as a trigger because I will execute the Logic App manually.
The first step will be a Get File Content action from the OneDrive connector. Once you have authorized Azure to access your OneDrive folder, select the file you want to read. For me, it's /dev/simpleNote.txt.
Integration Account
To access the workflowContext, the Azure Logic App requires an Integration Account. The next step is to create one. Save the current Logic App, and click the big "+" button in the top right corner. This time, search for integration. Select Integration Account, and complete the form to create it.
We now need to assign it to our Logic App. From the Logic App blade, select Workflow Settings in the options list. Then select your Integration Account, and don't forget to save!
Logic App: Inline Code
To add the action at the end of your workflow, click the New step button. Search for Inline Code, and select the action Execute JavaScript Code.
Before copy-pasting the code into the new Inline Code action, let's have a quick look at it.
// Get the text content from the Get file content action's outputs
var note = "" + workflowContext.actions.Get_file_content.outputs.body;
// Find the position just after the last opening square bracket
var posTag = note.lastIndexOf("[") + 1;
var cleanNote = {};
if (posTag > 0) {
    // Extract the tags (between the brackets, assuming the note ends with "]")
    cleanNote.tags = note.substring(posTag, note.length - 1);
    // Extract the message (everything before the opening bracket)
    cleanNote.msg = note.substring(0, posTag - 1);
}
return cleanNote;
On the first line, we assign the variable note the content of the Get_file_content outputs. We access it through the workflowContext, which has access to the trigger and the actions. To find the name of an action, replace the spaces with the underscore character "_".
You can also switch to Code View and see the names of all the components in the JSON code.
Logic App: Use Inline Code Result
Of course, you can use the output of your Inline Code in other steps. You just need to use the Result from the dynamic content menu.
If for some reason the dynamic content list doesn't contain your Inline Code, you can always add the expression directly: @body('Cleaning_Note')?['body'].
Your Logic App should now look like this:
Verdict
The Inline Code action is very promising. Right now, it's limited to JavaScript and cannot access variables or loops. However, for simple code that doesn't require any references, it's easier to maintain and deploy. You can learn more about what exactly is covered or not here.
And it works, as this result shows.
Top 10 C# Developer Books for Summer 2019 (Claudio Bernasconi) - Great list of books to get started with C# or as a developer. I'll definitely refer people to it when asked where to start.
Presentation Tips for Technical Talks (Tanya Janca) - This post is filled with great and simple tips that will surely improve the experience of your attendees, and yours too.
Nice book. There is always a good story to draw a correlation with the current point, and then it can go in a different direction with another story. All the stories are complementary, adding layer by layer to the more complex message being delivered to us. Easy to read, enjoyable from the beginning until the last word.
You are done with your code and you are ready to deploy it in Azure. You execute the PowerShell or Bash script you have and BOOM! An error message tells you that the name is already taken. In this post, I will show you a simple way to look like a boss and make your deployments work every time.
____ with given name ____ already exists.
The tricks others use
You could try to add a digit at the end of the resource name (ex: demo-app1, demo-app2, demo-app123...), but that's not really professional. You could create a random string and append it to the name. Yes, that will work... once. If you try to redeploy your resources, that value will change, so it will never be the same.
The solution would be to have a unique string that is constant in our environment.
The solution
The solution is to use the uniqueString() function, part of the Azure Resource Manager (ARM) template language. If we look in the documentation, uniqueString creates a deterministic hash string based on the values provided as parameters. Let's see a quick example of an ARM template to deploy a website named demo-app.
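A minimal sketch of such a template (for a real web app you would also need an App Service plan referenced through serverFarmId; the apiVersion is just one valid value):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2018-11-01",
      "name": "demo-app",
      "location": "[resourceGroup().location]",
      "properties": {}
    }
  ]
}
```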
If you try to deploy this template, you will have an error because the name demo-app is already taken... no surprise here.
Let's create a new variable suffix, using the Resource Group Id and Location as values. Then we just need to append this value to our name using the concat() function, as shown below.
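Here is a sketch of what that looks like in the template (the variable names are mine):

```json
"variables": {
    "suffix": "[uniqueString(resourceGroup().id, resourceGroup().location)]",
    "siteName": "[concat('demo-app-', variables('suffix'))]"
}
```

The resource's name property then becomes "[variables('siteName')]".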
It's that simple! Now, every time you deploy, a unique string will be added to your resource name. That string will always be the same for a given Resource Group and Location deployment.
Because some resource types are more restrictive than others, you may need to adapt your new name. Maybe the name of your resource plus that thirteen-character hash will be too long... No problem, you can easily make it shorter and all lowercase just by using substring() and toLower().
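For example, keeping only the first five characters of the hash could look like this (again, the names are only for illustration):

```json
"siteName": "[toLower(concat('demoapp', substring(variables('suffix'), 0, 5)))]"
```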
Voila! Now, by using an ARM template, you can deploy and redeploy without any problem, reproducing the same solution you built. To learn more about ARM templates, you can jump into the documentation, where you will find samples, step-by-step tutorials, and more.
If you have a specific question about ARM templates or if you would like to see more tips like this one, don't hesitate to ask in the comments section or reach out on social media!
Docker from the beginning — part III (Chris Noring) - Third post of this Docker series. I like how it is not only the happy path but the real learning path, with the fails and the victories.
Improve your Dockerfile, best practices (Chris Noring) - A nice quick post about some really easy best practices. They're so simple, why would you not follow them?
Introducing Windows Terminal (Kayla Cinnamon) - The awesome new terminal with a kickass look. Even more, it's an open source project!
Atomic Habits: An Easy & Proven Way to Build Good Habits & Break Bad Ones (James Clear) - An excellent book that is very pleasant to read. I really appreciated the way things are broken into tiny pieces. I don't think this book reinvented molecular physics, but by cutting and dissecting our habits that way, it's hard to think you can fail. It's easy to get started right now; I even started new habits before finishing the book!
Top 15 Visual Studio Code Extensions in 2019 (Gift Egwuenu) - Nice list of extensions that primarily targets JavaScript developers, but there are definitely a few gems for all of us.
Podcast
Anthos Migrate, with Issy Ben-Shaul (Kubernetes Podcast from Google) - Nice update. I liked the talk about Anthos; it looks like a great migration tool. I need to find that GitHub repo...
Azure Functions using Node with Simona Cotin (.NET Rocks!) - Great show. I just switched my website to that JAMstack pattern, and I was planning to use Azure Functions to add a few little twists... I'm happy to see I'm not alone thinking like that!
0230 - Alain Vezina - Le métier du DevOps (Visual Studio Talk Show) - Great episode; very interesting to hear about the DevOps role from someone who lives it every day. Thanks for the suggestion; I think I'm due to re-read The Phoenix Project.
Goal Setting Tips & Tracking KPIs (Video Pursuit Podcast) - Really interesting episode. Everybody talks about metrics and KPIs... but it's not frequent to hear about the "how". I really like how the goals are explained: achievable, but not easy... and how we should react when we don't reach them.
Avoiding Azure Functions Cold Starts (Mark Heath) - Are cold starts affecting your solutions? Maybe not, but if they are, this post lists three scenarios to reduce them as much as possible.
A really interesting book that helps you focus and keep the most important things in mind. I didn't really read it for business purposes, but it made me remember past experiences, and it was easy to see the relationship between success and the moments when the story was clear. Take the time to read it, do the exercises/reflections required... it's worth it.
How Azure Resource Graph is gonna change the way you search and script (Stephane Lapointe) - Whaat?! 15x faster! If you are not using Azure Resource Graph yet... this post is for you. If you do use it, still read the post; you may learn a few tricks. In short, it's a mandatory read for anyone using Azure.
Cloud
Azure Blueprints: ISO27001 Shared Services (Eric Leonard) - This excellent second post of the series goes deeper, shares details about one specific blueprint template, and explains some pitfalls to avoid.
Azure Blueprints: Intro (Eric Leonard) - If you don't know Blueprints, this post is an excellent first contact.
Was MongoDB Ever the Right Choice? (Justin Etheredge) - Nice post that puts into perspective what NoSQL is and why MongoDB may or may not be a good solution for our projects.
Static websites are lightning fast, and running them inside Azure Blob Storage instead of a Web App is incredibly economical (less than $1/month). Does that mean you need to do everything manually? Absolutely not! In a previous post, I explained how to automatically generate your static website using a Build Pipeline inside Azure DevOps. In this post, let's complete the CI-CD by creating a Release Pipeline to deploy it.
The Azure Resource Manager (ARM) Template
First things first. If we want our release pipeline to deploy our website to Azure, we first need to be sure our resources are available "up there." The best way to do this is by using an Azure Resource Manager (ARM) template. I will use the same project started in the previous post; feel free to adapt it to your structure or copy from it.
Create a new file named deploy.json in the deployment folder. We need a simple storage account.
I used a parameter (StorageName) to define the name of the storage account. This way, I could have multiple pipelines deploying into different storage accounts.
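A minimal sketch of that deploy.json (static websites require a StorageV2 account; the apiVersion and SKU are reasonable defaults, not the only options):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "StorageName": {
      "type": "string",
      "metadata": { "description": "Name of the storage account that will host the static website" }
    }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2018-07-01",
      "name": "[parameters('StorageName')]",
      "location": "[resourceGroup().location]",
      "kind": "StorageV2",
      "sku": { "name": "Standard_LRS" }
    }
  ]
}
```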
Now, to make the ARM template accessible to the release pipeline, we also need to publish it. The easiest way to do that is to add another Copy Files task to our azure-pipeline.yml. Add this task just before the PublishBuildArtifacts step, as sketched below.
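Something along these lines, assuming the template sits in the deployment folder as described above:

```yaml
# Publish the ARM template alongside the website artifacts
- task: CopyFiles@2
  displayName: 'Copy ARM template'
  inputs:
    SourceFolder: 'deployment'
    Contents: 'deploy.json'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deployment'
```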
Once you commit and push these changes, it will trigger a build. When done, the ARM template will be available, and we will be able to start working on the release pipeline.
The Release Pipeline
Navigate to the DevOps project created in the previous post. This time, create a new Release Pipeline. When asked, select an empty template; we will manually pick the tasks we need.
First, we need to define the trigger and where our artifacts are. Click the Add an artifact box on the left of the screen. Select the build project, and let's use the latest version of the artifact for our deployment.
To get continuous deployment, you need to enable it by clicking on the lightning bolt and selecting the Enabled toggle.
Now let's select our tasks. Click on the "+" sign to add new tasks. We need three of them: Azure Resource Group Deployment, Azure CLI, and Azure File Copy.
Task 1 - Azure Resource Group Deployment
The first one is an Azure Resource Group Deployment. It will be used to deploy our ARM template and be sure the resources are available in Azure.
To configure the ARM deployment, we need to select the Azure subscription and authorize the pipeline to have access. Then you will need to specify the name of the resource group you will be deploying into, its location, and finally point to the linked ARM template.
Task 2 - Azure CLI
The second one is an Azure CLI task. As I am writing this post, it's not possible to enable the static website property of a storage account from an ARM template. Therefore, we will execute an Azure CLI command to change that configuration. Once you have picked the Azure subscription, select Inline script and enter this Azure CLI command:
az storage blob service-properties update --account-name wyamfrankdemo --static-website --index-document index.html
This will enable the static website property of the storage account named wyamfrankdemo, and set the default document to index.html.
Task 3 - Azure File Copy
The last task is an Azure File Copy to copy all our files from $(System.DefaultWorkingDirectory)/drop/drop/output to the $web container in our Azure Blob storage. The container must be named $web; that's the name Azure uses for the static website.
Wrapping up
Once you are done configuring the Release Pipeline, it's time to save and run it. After only a minute or two (this demo is pretty small), the blog should be available in Azure. To find your endpoint (aka URL), you can go to portal.azure.com and look at the static website property of the blob storage we just created.
I have that little website, a blog that doesn't consume much bandwidth, and I was looking to optimize it. Since Azure Blob Storage is such an inexpensive resource, I thought it would be the perfect fit. I could use a static website generator to transform my markdown files into a nice-looking blog and publish that to Azure! Using an Azure DevOps pipeline, I could do all of that automatically at every git push, without having anything installed on my machine... meaning I could write a new blog post from anywhere and still be able to update my blog.
In this post, I will explain all the steps required to create a continuous integration and continuous deployment process to deploy a static website into Azure.
The Goal
The idea here is to have, on a local machine, a folder tracked by git. At every push, we want that change to trigger our CI-CD process. The Build Pipeline will generate the static website, and the Release Pipeline will create our Azure resources and publish those artifacts.
The Static Website
In this post, I'm using Wyam.io as the static website generator. However, it doesn't matter; there are a ton of excellent generators available: Jekyll, Hugo, Hexo, etc. I selected Wyam because it is written in .NET, and if eventually I want to dig deeper, it will be easier for me.
All those generators follow the same pattern: you have an input folder containing all your posts and images, and an output folder that contains the generated result. You don't need to track the content of your output folder, so it's a good practice to modify the .gitignore file accordingly. As an example, here is how mine looks.
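Something like this, a minimal sketch assuming Wyam's default output folder and the tools folder created by the Cake bootstrapper:

```
# Generated site - no need to track it
output/

# Tools downloaded by the Cake bootstrapper (build.ps1 / build.sh)
tools/
```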
The build pipeline will generate our website for us; therefore, it needs to have the generator installed. A great tool for this kind of task is Cake. What I like about Cake is that it is cross-platform, so I can use it without worrying about which OS it will run on.
The Azure pipeline is defined in an azure-pipeline.yml file. Installing Cake should definitely be one of our first steps. To learn how to do that, navigate to the Get Started page of Cake's website, where it's explained that we need to execute build.ps1 or build.sh (depending on your build setup). That will install Cake and execute the file build.cake. Those files can be found in the GitHub repository mentioned on the website.
On the Wyam website, in the deployment section of the documentation, you will find a sample for our required build.cake file. It looks like this:
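A version along the lines of the Wyam documentation sample:

```csharp
// Install the Wyam tool and the Cake.Wyam addin
// (you should pin explicit versions here)
#tool nuget:?package=Wyam
#addin nuget:?package=Cake.Wyam

var target = Argument("target", "Build");

// Generate the static website into the output folder
Task("Build")
    .Does(() =>
    {
        Wyam();
    });

// Serve the site locally and rebuild on changes
Task("Preview")
    .Does(() =>
    {
        Wyam(new WyamSettings
        {
            Preview = true,
            Watch = true
        });
    });

RunTarget(target);
```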
On the first line, it installs the required NuGet package (you should definitely specify the version). Then it defines some tasks and runs the generation command. Create that file at the root of the website folder.
Now let's have a look at the azure-pipeline.yml file.
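Here is a sketch of it; the exact task and SDK versions are illustrative, so adapt them to your setup:

```yaml
# Trigger the pipeline on every push to master
trigger:
- master

variables:
  # Keep the .NET Core version in one place for easier maintenance
  dotnetVersion: '2.2.300'

pool:
  vmImage: 'windows-2019'

steps:
# Step 1: install .NET Core
- task: DotNetCoreInstaller@0
  inputs:
    version: '$(dotnetVersion)'

# Step 2: run the Cake bootstrapper to generate the static website
- powershell: ./build.ps1
  displayName: 'Generate the static website'

# Step 3: copy the generated output into the artifact staging directory
- task: CopyFiles@2
  inputs:
    SourceFolder: 'output'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/output'

# Step 4: publish the artifacts so the Release Pipeline can access them
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```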
The first line specifies the pipeline trigger. In our case, we are watching the master branch. Then I declare a variable to keep the .NET Core version; that way, it will be easier to maintain the script in the future.
The pool command specifies what kind of agent is created. Here I'm using a Windows one, yet I could have used Linux too (all the components are cross-platform).
Then comes the list of steps. The first one installs .NET Core. The second step is a PowerShell command to execute our build.ps1 file. At this stage, the static website should be generated in a subfolder named output. The last two steps copy the content of the output folder into the ArtifactStagingDirectory and then publish it. This way, the Release Pipeline can access the artifacts.
There is detailed information about all the commands of a YAML Azure Pipeline file in the documentation. Create your own, or copy-paste this one into a new azure-pipeline.yml file under a subfolder named deployment. Once your file is created, commit and push it to GitHub or any repository.
Navigate to Azure DevOps (dev.azure.com). Open your project, or create a new one. Now, from the left menu, click on Pipelines (the rocket icon) to create a new pipeline. If you are using an external repository, like me, you will need to authorize Azure DevOps to access your repo.
To configure the pipeline, since we already created the azure-pipeline.yml file, select the Existing Azure Pipelines YAML file option and point it to our file in the deployment folder.
It will open our YAML file; update it if you wish. Run it by clicking the blue Run button in the top-right corner. Your build pipeline is done. Now, every time you push changes to your repository, the build will be triggered and will generate the static website.
First steps with Docker and Kubernetes - Introduction (Matteo Pagani) - Wow, fantastic post to get started with Kubernetes. The author mentions that after reading this you won't be an expert... however, you will definitely know enough to be dangerous.