Reading Notes #351

MVIMG_20181111_190706

Cloud


Programming


Data


~


Reading Notes #350



markdig

Cloud


Programming


Miscellaneous


Books

Fast Focus (Damon Zahariades) - Great short book. Unlike others of its kind, this book goes right to the point and offers actionable items. It's very practical and accessible to everyone. At the end of the book, you know what to do to get started and improve your focus.



Let’s create a continuous integration and continuous deployment (CI-CD) with Azure DevOps

I'm about to start a new project and want it to have continuous integration (CI) and continuous deployment (CD). I've been using VSTS for a while now but hadn't had the chance to try the new pipelines. If you didn't know, VSTS has been rebranded/redefined as Azure DevOps. Before diving in with the real thing, I decided to give it a try with a simple project. This post relays those first steps.

Get Started


Let's start by creating our Azure DevOps project. Navigate to dev.azure.com and, if you don't already have an account, create one: it's free! Once you are logged in, create a new project by clicking the blue New project button in the top right corner.

createNewProject

You will need to provide a unique name and a few basic pieces of information.

The Application


First things first, we need an application. For this post, I will be using a simple ASP.NET Core site. For the repository, we have options: Azure DevOps supports many repository types: GitHub, Bitbucket, private Git repositories, and its own. Because the project I've created is public, I decided to keep the code in the same place as everything else.

From the left menu, select Repos. From here, if the code already exists, just add the remote repository; otherwise clone the empty one on your local machine, the usual. Create and add your code to that repository.
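If your code already exists locally, the usual Git commands look roughly like this (a sketch; the URL is a placeholder for the one Azure DevOps shows you under the Clone button):

# Point the existing local repository to the new Azure Repos remote (placeholder URL)
git remote add origin https://dev.azure.com/<organization>/<project>/_git/<repository>

# Push the existing code (the default branch was still master at the time of writing)
git push -u origin master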

Repos

The Azure WebApp


The next step is to create a placeholder for our CD pipeline. We will create an empty shell of a web application in Azure with these three Azure CLI commands. You can execute them locally or from the Cloud Shell. (Don't forget to validate that you are in the right subscription; a quick way to check is shown after the commands.)
az group create --name simplegroup --location eastus

az appservice plan create --name simpleplan --resource-group simplegroup --sku FREE

az webapp create --name simplefrankweb --resource-group simplegroup --plan simpleplan
The first command creates a resource group. Then, inside this group, we create an App Service plan, and finally we add a web app to the mix.
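To double-check which subscription the Azure CLI is pointing at before running those commands, this quick check does the trick (the subscription name below is a placeholder):

# Display the subscription currently in use by the Azure CLI
az account show --output table

# If needed, switch to the right one (by name or ID)
az account set --subscription "My-Subscription-Name"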

Continuous Integration


The goal is to have the code compile at every commit. From the left menu bar, select Pipelines and click the button to create a new pipeline. The first step is to identify where our code is; as you can see, Azure DevOps is flexible and accepts code hosted outside of it.

NewPipeline_step1

Select the exact repository.

NewPipeline_step2

This third step displays the YAML code that defines your pipeline. At this point, the file is not complete, but it's enough to build; we will come back to it later. Click the Add button to add the azure-pipelines.yml file at the root level of your repository.

NewPipeline_step3

The build pipeline is ready; click the Run button to execute it for the first time. Now, at every commit, the build will be triggered. To see the status of your builds, just go into the builds section from the left menu bar.

buildSuccess

Continuous Deployment


Great, our code now compiles at every commit. It would be nice if the code could also be automatically deployed into our dev environment. To achieve that, we need to create a release pipeline, and our pipeline will need artifacts. We will edit azure-pipelines.yml to add two new tasks. You can do this directly in the online repository or from your local machine; remember, the file is at the root. Add these tasks:

- task: DotNetCoreCLI@2
  displayName: 'dotnet publish $(buildConfiguration)'
  inputs:
    command: publish
    publishWebProjects: True
    arguments: '--configuration $(buildConfiguration) --output $(Build.ArtifactStagingDirectory)'
    zipAfterPublish: True

- task: PublishBuildArtifacts@1
  displayName: 'publish artifacts'

Those two tasks publish our application (package it) and make it available in the artifact folder. To learn more about the types of tasks available and to see examples, have a look at the excellent documentation at https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core. Once you are done, save and commit (and push if you edited locally).

From the left menu bar, click on Pipelines, select Releases, and click the blue button to create a new release pipeline. Select the template that matches your application. For this post, Azure App Service deployment is the one we need.

NewRelease_step1
The next thing to do is rename the environment to something other than Stage 1. I named mine "to Azure", but it could be dev, prod, or anything that makes sense for you. Then click the Add an artifact button.

ReleasePipeline

You will now tell the pipeline where to pick up the artifacts it will deploy. In this case, we want the "output" of our latest build. I also renamed the source alias to Drop.

AddArtifact

To get our continuous deployment (CD), we need to enable the trigger by clicking on the little lightning bolt and turning it on.

TriggerRelease

The last step in configuring the release pipeline is to specify a destination. Click on "1 job, 1 task" in the middle of the screen (with the little red exclamation point in a circle); that will open the window where we do just that.

Select the subscription you would like to use, then click the Authorize button on the right. Once that's done, go change the App Service name. Click on it, wait 2-3 seconds, and you should see the app we created with the Azure CLI appear. Select it, and voila!

SetupReleaseDetails

Now add a ReadMe.md file, either by checking out the code on your local machine or directly in Azure DevOps. Grab a badge from the build and/or release and copy-paste it into the ReadMe. To get the code snippet for your badge, go to your build/release definition and click the ellipsis button. Select Status badge and copy the snippet that matches your destination file (in our case, Markdown).

GetaBadge_2

Now, when you go to the Overview page, you will have a nice badge that keeps you informed of the status. It also works on any web page; just use the HTML snippet instead.

simple_frank

In a video, please!


I also have a video of this post if you prefer.




References:



Reading Notes #349

playWithDockerCloud


Programming


Miscellaneous


~ Enjoy!

Reading Notes #348

IMG_20181016_073145

Cloud


Programming


Data


Miscellaneous


~enjoy!

Reading Notes #347

MVIMG_20181011_103658

Cloud


Programming


Data


Miscellaneous



Reading Notes #346

Cloud

IMG_20181006_093540 (1)

Programming


Miscellaneous

  • Microsoft Ignite Aftermath (Chris Pietschmann, Dan Patrick) - If you are like me and need to catch up on what happened at Ignite, this post is a really good place to start, as it contains a list of all the links we need.

~

Reading Notes #345

DSCF1554

Suggestion of the week



Cloud



Programming



Miscellaneous

~

How to Create Your Custom Artifacts for DevTest Labs

Most of the time when we use an Azure DevTest Lab, it's to test our own applications. This means that we need to install them on the virtual machines, every time. To do that, we need to create a custom artifact and add it to our formulas or to our claimable VMs. Lucky for us, creating a custom artifact is much easier than you may think. In this post, I will show you how easy it can be.

Goal

I want to create an artifact available from a private repository (Git from dev.azure.com in this case) that will set the timezone inside the VM.

Getting started

First, let's use a section of the Azure portal that is very useful: the Getting Started section. In the portal, navigate to your DevTest Lab (1) and select the Getting Started option from the left menu bar (2). In this new blade, scroll down to the Learn more area and select Sample artifacts and scripts (3).

GetStarted
That will open the DevTest Lab artifacts, scripts and samples project from Azure on GitHub. Open the Artifacts folder to see the list of all the usual artifacts found in the public repo that is available by default in the portal.

Notice how each artifact is in its own folder. When you create a new artifact, you can always come here and pick something similar to what you are trying to do. This way, you won't start from scratch. Let's open windows-vsts-download-and-run-script. An artifact is defined in the file Artifactfile.json. This file is mandatory and cannot be renamed. You can put scripts, images, or anything else you need inside this folder.

Open the Artifactfile.json file and have a look.

ArtifactfileSample
As you can see, it's a simple JSON file. In section (A), you define the title, description, publisher, OS, and icon. Note that the icon must be publicly accessible; it could be on GitHub, in blob storage, or on a website. Section (B) defines all the parameters you may need to install your artifact on the VM. Finally, section (C) is the command to execute.

Create the Artifact

Here is the JSON for our windows-Set-TimeZone artifact. I made it very static by not passing any parameters, but in a real situation, a timezone parameter would be better.
Artifactfile.json
{
    "$schema": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/schemas/2016-11-28/dtlArtifacts.json",
    "title": "Set TimeZone to Eastern Standard Time",
    "description": "Execute tzutil command on the VM set set the Time Zone",
    "publisher": "FBoucher",
    "tags": [
        "PowerShell"
    ],
    "iconUri": "https://raw.githubusercontent.com/Azure/azure-devtestlab/master/Artifacts/windows-run-powershell/powershell.png",
    "targetOsType": "Windows",
    "parameters": { },
    "runCommand": {
        "commandToExecute": "tzutil.exe /s \"Eastern Standard Time\""
    }
}

Create an artifact repository

For this post, I'm using Git from Azure DevOps (dev.azure.com), previously named VSTS, but any private repository should work. If it's not already done, create a project and go to the Repos section. Create a root folder named Artifacts, or something else if you prefer. Then add a new folder for your artifact. To follow best practices, the name of your artifact should start with the name of the targeted OS; in my case, windows-Set-TimeZone. Now add the Artifactfile.json file defined previously.
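From a local clone of that repository, the steps look roughly like this (a sketch; folder and file names follow the example above):

# From the root of a local clone of the repository
mkdir -p Artifacts/windows-Set-TimeZone

# Create Artifacts/windows-Set-TimeZone/Artifactfile.json with the JSON shown earlier,
# then commit and push it
git add Artifacts/windows-Set-TimeZone/Artifactfile.json
git commit -m "Add windows-Set-TimeZone artifact"
git push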

Note the URL of the repository; it's easy to get by clicking the Clone button at the top right of the screen.

clone

Add repository to DevTest Lab

Now we need to add this repository to our DevTest Lab. From portal.azure.com, open the blade of your lab. From the left panel, click on Repositories, then click the Add button.

CreateRepo
It's time to use the information noted previously. This is about the Repository, not the artifact.

Use the Artifact

The only thing left is to use our artifact. You can find it while creating a VM or a formula. When you have parameters defined in your Artifactfile.json, they will be listed in a form for you to complete.
artifactList
And if you try it, you will see that the time matches the desired timezone. Here my PC is set to display the time in a 24-hour format, but it's the same time... yep, I'm in Eastern Standard Time.

voila

Add it to an ARM template

Doing it with the nice interface is good when you are learning. However, we all know that nobody doing DevOps will do that manually every time. So let's add our repository to our ARM template. If you need more details on the deployment method, I explain it in a previous post: How to be efficient with our Azure Devtest Lab deployments.

When you don't know the type or the structure of a resource, you can always go to the Resource Explorer (resources.azure.com); there you will be able to find your resource and see how it's defined.

ResourceExplorer
So for this post, our artifactsources resource will look like this:
{
    "properties": {
        "displayName": "Cloud5mins",
        "uri": "https://fboucher.visualstudio.com/DefaultCollection/Cloud5minsArtifacts/_git/Cloud5minsArtifacts",
        "sourceType": "VsoGit",
        "folderPath": "/Artifacts",
        "armTemplateFolderPath": "",
        "branchRef": "master",
        "securityToken": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
        "status": "Enabled"
    }, 
    "name": "Cloud5minsRepo",
    "type": "Microsoft.DevTestLab/labs/artifactsources"
}
An artifactsources resource goes in the resources list inside the DevTest Lab.

ARM
In an ARM template, you have the main resources node (A), then the lab node (B). Inside this node, you should see a second resources list (C), where the virtual network is defined. The artifactsources resource should go there.

Then when you declare your formula, you just need to reference this repository, exactly like the public one.

In a video, please!

I also have a video of this post if you prefer.



Reference:

Reading Notes #344

CI-CD

Suggestion of the week


Cloud


Programming


Books

Five_cover
The Five Dysfunctions of a Team: A Leadership Fable (Patrick Lencioni) - I really enjoyed this book. The fact that the material is first presented as a story adds a lot of perspective and helps our comprehension. In the last chapter, the author returns to the theory and gives more details. I completely devoured this book; I'm looking forward to reading more.


Miscellaneous


~Enjoy


What happens when you mix Asp.Net Core, different versions of Docker and Azure for the first time

For a project I have, I wanted to validate whether containers were easier to use compared to regular code with services in Azure. I needed to refresh myself on Docker, so I decided to do what I thought would be a simple test: create an ASP.NET Core website in a container and access it on my machine.

This post is about my journey to finally achieve this goal; as you may guess, it didn't work on the first attempt.

The Goal


One reason why I was looking at containers is because they are supposed to work everywhere, right? Well, yes, but sometimes with a little effort. The goal here is to be able to run the same container on my main PC, my Surface, a Linux VM, and of course in Azure.

The context


I have a different setup on my main machine and on my Surface. On my PC, I'm using VirtualBox for my VMs, so I'm not running Docker for Windows but Docker Toolbox. This flavor (older version) of Docker creates a VM in VirtualBox instead of Hyper-V. I couldn't use Docker for Windows like on my Surface, because the two virtualization technologies don't run side by side.

I also wanted to use only tools available on each of these platforms, so I decided not to use the Visual Studio IDE (the big one). Moreover, I wanted to understand what was happening, so I didn't want too much magic involved. Visual Studio is a fantastic tool and I love it. :)

Installing Docker


I needed to install Docker on my Surface. I downloaded Docker Community Edition (CE), and because Hyper-V was already installed, everything ran smoothly. On Windows, you need to share the "C" drive from the Docker settings. However, I was getting a strange "bug" when trying to share mine. It was asking me to log in with AzureAD and was ignoring my request, leaving the shared drive unchecked.

dockerazureadcredentials

Thanks to my new friend Tom Chantler, I didn't search for too long. See, the thing is, I'm using an AzureAD account to log in, and something is not working right at the moment. As explained in Tom's post, Sharing your C drive with Docker for Windows when using Azure Active Directory, to work around this situation, I only had to create a new user account with the exact same name as my AzureAD account, but without the AzureAD prefix (ex: AzureAD\FBoucher became FBoucher). Once that was done, I could share the drive without any issue.

Let's get started with the documentation


The hello-world container worked like a charm, so I was ready to create my ASP.NET Core website. My reflex was to go to docs.docker.com and follow the instructions from Create a Dockerfile for an ASP.NET Core application. I was probably doing something wrong, because it didn't work. So I decided to start from scratch and do every step manually... I always learn more that way.

Let's start by the beginning


Before moving everything into a container, we need a web application. This can easily be done from the terminal/command prompt with these commands:

dotnet new mvc -o dotnetcoredockerappservicedemo

cd dotnetcoredockerappservicedemo

dotnet restore

dotnet publish -c release -o app/ .

Here we create a new folder with a website using the mvc template. We then go into that new folder and restore the NuGet packages. To test the website locally, simply use dotnet run. And finally, we build and publish the application into the subfolder app.


Moving to Docker


Now that we have our app, it's time to containerize it. We need to add some Docker instructions in a dockerfile. Add a new file named dockerfile (no extension) to the root folder and copy/paste these commands:

# dockerfile

FROM microsoft/dotnet:2.1-aspnetcore-runtime 
WORKDIR /app
COPY /app /app
ENTRYPOINT [ "dotnet" , "dotnetcoredockerappservicedemo.dll"]

To start Docker with Docker Toolbox, just start the Docker Quickstart Terminal.
These instructions specify how to build our container. First, Docker will download the base image (microsoft/aspnetcore or microsoft/dotnet:2.1-aspnetcore-runtime). We specify the working directory, then copy the app folder to the app folder inside the container. Finally, we specify the entry point of our application, telling it to start with dotnet.
Like Git and its .gitignore file, Docker has the same thing with .dockerignore (no extension). Add that file to your folder to ignore the bin and obj folders.


# .dockerignore
bin/
obj/

Now that the instructions for building our container are complete, we can build it. Execute the following command:

docker build -t dotnetcoredockerappservicedemo .

This will build dotnetcoredockerappservicedemo from the current folder.

Running Docker container locally


Everything is in place; the only thing left is to run it. If you want to run it locally, just go with this command:

docker run -p 8181:80 dotnetcoredockerappservicedemo

On my machine, port 80 is always in use, so I remap port 80 to 8181; feel free to change it at your convenience. The website will be available at localhost:8181.

If you are running Docker Toolbox (the older version of Docker), you need to get the IP of your VM. To get it, run:

docker-machine ip

Running in the cloud


To run our container in Azure, you will need to publish it to the cloud first. It could be on Docker Hub or in a private registry on Azure. I decided to go with Azure. First, we need to create a registry, then publish our container.

az group create --name dotnetcoredockerappservicedemo --location eastus

az acr create --resource-group dotnetcoredockerappservicedemo --name frankContainerDemo01 --sku Basic --admin-enabled true

az acr credential show -n frankContainerDemo01

The last command, az acr credential show, provides the information needed to tag our container with our repository name and also gives us the credentials to be able to push. Of course, you could go to portal.azure.com and get the information from the registry's Access keys blade.

docker tag dotnetcoredockerappservicedemo frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1

Let's connect our docker to our registry, and then push (upload) our container to Azure.


# The https:// is important...

docker login https://frankcontainerdemo01.azurecr.io -u frankContainerDemo01 -p <Password_Retreived>

docker push frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1
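As a side note, if your Azure CLI session is already logged in, the az acr login command can handle the Docker authentication for you, without pasting the password (a quick sketch using the registry name from this post):

# Authenticate the local Docker client against the registry using the current Azure CLI session
az acr login --name frankContainerDemo01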


Great, the container is in Azure. Now let's create a quick web app to see it. We could also use Azure Container Instances (ACI), which would be only one command, but because the demo is a website, it wouldn't make sense to use ACI for that.
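For the curious, that single ACI command would look something like this (a sketch only, reusing the registry and credentials from the previous steps; the container name and DNS label are made-up examples):

# Run the container from the private registry as an Azure Container Instance
az container create --resource-group dotnetcoredockerappservicedemo --name dotnetcoredockerdemo --image frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1 --registry-login-server frankcontainerdemo01.azurecr.io --registry-username frankContainerDemo01 --registry-password <Password_Retreived> --dns-name-label frankcontainerdemo --ports 80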

To get an App Service, we need a service plan, and then we will create an "empty" web app. To do that, we specify the runtime without providing any code/binary/container. I wasn't able to create a web app from a private Azure registry in one command, which is why I'm doing it in two.

az appservice plan create --name demoplan --resource-group dotnetcoredockerappservicedemo --sku S1 --is-linux

az webapp create -g dotnetcoredockerappservicedemo -p demoplan -n frankdockerdemo --runtime "DOTNETCORE|2.1"

On Windows, I got the following error message: '2.1' is not recognized as an internal or external command, operable program or batch file. The PowerShell command line escape "--%" solves the problem: az --% webapp create -g dotnetcoredockerappservicedemo -p demoplan -n frankdockerdemo --runtime "DOTNETCORE|2.1"

If you check the website right now, you should see a page saying that the site is up but empty. Let's update the container settings with our registry and image information.

az webapp config container set -n frankdockerdemo -g dotnetcoredockerappservicedemo --docker-custom-image-name frankcontainerdemo01.azurecr.io/dotnetcoredockerappservicedemo:v1 --docker-registry-server-url https://frankcontainerdemo01.azurecr.io --docker-registry-server-user frankContainerDemo01 --docker-registry-server-password <Password_Retreived> 

It works, of course!
final2


Conclusion


It's only four steps: create the .NET Core application, package it into a Docker container, publish the container to our Azure registry, and create an App Service based on that container. However, because all this tech is cross-platform, you sometimes get tiny differences between the platforms, and those can become time-consuming. It was a great little project that turned out to be a lot more than expected, but I learned so much!

I'm very happy with the result... expect more Docker content in the future!


In a video, please!


I also have a video of this post if you prefer.




References

Reading Notes #343

2018-09-02_20-42-43

Cloud


Programming


Reading Notes #342

Connector

Cloud


Programming

  • Writing a Blazor App (David Pine) - This tutorial shows how to build a simple Blazor app... and it's NOT the hello-world or todo app.

Miscellaneous



How to be efficient with our Azure Devtest Lab deployments

(This post is also available in French.)

Azure DevTest Labs is a fantastic tool to quickly build environments for development and test purposes, or for a classroom. It offers great tools to restrict users without removing all their freedom. It speeds up onboarding, with claimable VMs that are already created and waiting for the user. Formulas help ensure that you always get the latest version of your artifacts installed on those VMs. And finally, the auto-shutdown keeps your money where it should stay... in your pocket.


In this post, I will show you how to deploy an Azure Devtest Lab with an Azure Resource Manager (ARM) template, and create the claimable VMs based on your formulas in one shot.

Step 1 - The ARM template


First, we need an ARM template. You can start from scratch of course, but it may be a lot of work if you are just getting started. You can also pick one from GitHub and customize it.

What I recommend is to create a simple Azure DevTest Lab directly from the Azure portal. Once your lab is created, go to the Automation script option of the resource group and copy/paste the ARM template into your favorite text editor.
armTemplate
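If you prefer the command line, the Azure CLI can export the same template (a quick sketch; the resource group name comes from this example and the output file name is arbitrary):

# Export the ARM template of an existing resource group into a file
az group export --name cloud5mins > exported-lab.json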
Now you must clean it up. If you don't already know it, use the 5 Simple Steps to Get a Clean ARM Template method; it's an excellent way to get started.
Once the template is clean, we need to add a few things that didn't come along during the export. Usually, in an ARM template, you get one list named resources. However, a DevTest Lab also contains its own list named resources, and that one is probably missing.
{
    "parameters": {},
    "variables": {},
    "resources": [],
}
In the following example, I added the lab's resources list just after the lab's location. This list must contain a virtualnetworks resource. It's also a good idea to add a schedules and a notificationChannels resource. Those two will be used to automatically shut down all the VMs and to send a notification to the user just before.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        ...
    },
    "variables": {
        ...
    },
    "resources": [
        {
            "type": "Microsoft.DevTestLab/labs",
            "name": "[variables('LabName')]",
            "apiVersion": "2016-05-15",
            "location": "[resourceGroup().location]",
            "resources": [
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "[variables('virtualNetworksName')]",
                    "type": "virtualnetworks",
                    "dependsOn": [
                        "[resourceId('microsoft.devtestlab/labs', variables('LabName'))]"
                    ]
                },
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "LabVmsShutdown",
                    "type": "schedules",
                    "dependsOn": [
                        "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                    ],
                    "properties": {
                        "status": "Enabled",
                        "timeZoneId": "Eastern Standard Time",
                        "dailyRecurrence": {
                            "time": "[variables('ShutdowTime')]"
                        },
                        "taskType": "LabVmsShutdownTask",
                        "notificationSettings": {
                            "status": "Enabled",
                            "timeInMinutes": 30
                        }
                    }
                },
                {
                    "apiVersion": "2017-04-26-preview",
                    "name": "AutoShutdown",
                    "type": "notificationChannels",
                    "properties": {
                        "description": "This option will send notifications to the specified webhook URL before auto-shutdown of virtual machines occurs.",
                        "events": [
                            {
                                "eventName": "Autoshutdown"
                            }
                        ],
                        "emailRecipient": "[variables('emailRecipient')]"
                    },
                    "dependsOn": [
                        "[resourceId('Microsoft.DevTestLab/labs', variables('LabName'))]"
                    ]
                }
            ],
            "dependsOn": []
        }
        ...

Step 2 - The Formulas


Now that the DevTest Lab is well defined, it's time to add our formulas. If you have already created some from the portal, don't look for them in the template; at the moment, the export won't script the formulas.

A quick way to get the JSON of your formulas is to create them from the portal and then use the Azure Resource Explorer to get the code.
resourceExplorer
In a web browser, navigate to https://resources.azure.com to open the Resource Explorer. Select the subscription, resource group, and lab that you are working on. In the Formulas node (4) you should see your formulas; click one and let's bring that JSON into our ARM template. Copy-paste it at the resource level (the primary one, not the one inside the lab).
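The Azure CLI can also dump that JSON, using the same lab commands the deployment script relies on later (a sketch with the lab and formula names from this example):

# List the formulas of the lab and filter on a specific one to get its JSON definition
az lab formula list -g cloud5mins --lab-name C5M-DevTestLab --query "[?name=='SimpleDevBox']"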

Step 2.5 - The Azure KeyVault


You shouldn't put any passwords inside your ARM template; however, having them pre-defined inside the formulas is pretty convenient. One solution is to use an Azure KeyVault.

Let's assume the KeyVault already exists; I will explain how to create it later. In your parameter file, add a parameter named adminPassword and reference the KeyVault. We also need to specify the secret we want to use. In this case, we will put the password in a secret named vmPassword.
    "adminPassword": {
        "reference": {
            "keyVault": {
                "id": "/subscriptions/{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}/resourceGroups/cloud5mins/providers/Microsoft.KeyVault/vaults/Cloud5minsVault"
            },
            "secretName": "vmPassword"
        }
    }
Now to get the password in the ARM template just use a regular parameter, and voila!

Step 3 - The ARM Claimable VMs


Now that we have a lab and the formulas, the only thing missing is the claimable VMs based on those formulas. It's impossible to create both formulas and VMs in one ARM template. The alternative is to use a script that creates our VMs just after the deployment.
az group deployment create --name test-1 --resource-group cloud5mins --template-file DevTest.json --parameters DevTest.parameters.json --verbose

az lab vm create --lab-name C5M-DevTestLab -g  cloud5mins --name FrankDevBox --formula SimpleDevBox  
As you can see in the second Azure CLI command, we are creating a virtual machine named FrankDevBox based on the formula SimpleDevBox. Note that we don't need to specify any credentials because everything was pre-defined in the formula. Pretty neat!
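As a side note, claiming one of those VMs doesn't have to happen in the portal either; assuming your CLI version includes the same lab commands used above, something like this should do it (a sketch, not a verified recipe):

# Claim the pre-created VM so it gets assigned to the current user
az lab vm claim --lab-name C5M-DevTestLab -g cloud5mins --name FrankDevBox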

Here is part of a script that creates a KeyVault (if it doesn't already exist) and populates it. Then it deploys our ARM template and, finally, creates our claimable VM. You can find all the code in my GitHub project: Azure-Devtest-Lab-efficient-deployment-sample.

[...]

# Checking for a KeyVault
searchKeyVault=$(az keyvault list -g $resourceGroupName --query "[?name=='$keyvaultName'].name" -o tsv )
lenResult=${#searchKeyVault}

if [ ! $lenResult -gt 0 ] ;then
    echo "---> Creating keyvault: " $keyvaultName
    az keyvault create --name $keyvaultName --resource-group $resourceGroupName --location $resourceGroupLocation --enabled-for-template-deployment true
else
    echo "---> The Keyvaul $keyvaultName already exists"
fi


echo "---> Populating KeyVault..."
az keyvault secret set --vault-name $keyvaultName --name 'vmPassword' --value 'cr@zySheep42!'


# Deploy the DevTest Lab

echo "---> Deploying..."
az group deployment create --name $deploymentName --resource-group $resourceGroupName --template-file $templateFilePath --parameters $parameterFilePath --verbose

# Create the VMs using the formula created in the deployment

labName=$(az resource list -g cloud5mins --resource-type "Microsoft.DevTestLab/labs" --query [*].[name] --output tsv)
formulaName=$(az lab formula list -g $resourceGroupName  --lab-name $labName --query [*].[name] --output tsv)

echo "---> Creating VM(s)..."
az lab vm create --lab-name $labName -g  $resourceGroupName --name FrankSDevBox --formula $formulaName 
echo "---> done <--- code="">

In a video, please!


I also have a video of this post if you prefer.



Conclusion


Whether it's for development, testing, or training, as soon as you are creating environments in Azure, DevTest Labs is definitely a must. It's a very powerful tool that not enough people know about. Give it a try and let me know what you do with Azure DevTest Labs!


References:

  • Azure-Devtest-Lab-efficient-deployment-sample: https://github.com/FBoucher/Azure-Devtest-Lab-efficient-deployment-sample
  • An Overview of Azure DevTest Labs: https://www.youtube.com/watch?v=caO7AzOUxhQ
  • Best practices Using Azure Resource Manager (ARM) Templates: https://www.youtube.com/watch?v=myYTGsONrn0&t=7s
  • 5 Simple Steps to Get a Clean ARM Template: http://www.frankysnotes.com/2018/05/5-simple-steps-to-get-clean-arm-template.html



~

Reading Notes #341

IMG_20180815_181252

Cloud


Programming


Integration


Miscellaneous


Books

  • Vaporized: Solid Strategies for Success in a Dematerialized World (Robert Tercek) - I really loved this book. In this world of digital transformation where everything goes so fast, this book explains why you should care. In fact, it asks so many good questions and relates so many facts. I like to pretend I'm aware of new technologies, that I'm on the edge, that I'm aware of the trending stuff... But guess what?! I got surprised, and even a bit scared at one point. This book is a must. Enjoy! ASIN: B01F9G31H8

Reading Notes #340

fan-out-fan-in

Cloud


Programming


Integration


Miscellaneous


Reading Notes #339

IMG_20180725_154113

Cloud



Programming



Data



Miscellaneous



~Enjoy!


Reading Notes #338

ChocolateyGUI_main_screen

Suggestion of the week




Cloud


Programming


Miscellaneous


Reading Notes #337

IMG_20180707_220101

Cloud


Programming


Data


Miscellaneous