How to Automatically Generate Video Sub-Title in Another Language

I recently started a French YouTube channel. Quickly, I got a message asking me to add English subtitles, and also a suggestion to leverage Azure Logic Apps and some Cognitive Services to help me with that task. I really liked the idea, so I gave it a shot. I recorded myself, and in twenty minutes I was done. Even though it was not the success I was hoping for, the application works perfectly. It's just that speaking in French with a lot of English technical words was a little bit too hard for the Video Indexer. However, if you speak only one language in your video, this solution will work perfectly. In this post, I will show you how to create that Logic App with Azure Video Indexer and Cognitive Services.

The Idea


Once a video is dropped in a OneDrive folder (or any file system accessible from Azure), a Logic App gets triggered; it uploads the file to the Azure Video Indexer, generates a Video Text Tracks (VTT) file, and saves this new file in another folder. A second Logic App then starts and uses the Translator Text API from Azure Cognitive Services to translate the VTT file and save it into the final folder.

GenerateSubTitle


The Generation


Before getting started, you will need to create your Video Indexer API subscription. To do this, log in to the Video Indexer developer portal, and subscribe to the Video Indexer APIs - Production in the Products tab. You should then get your API keys.


ApiKeyPage

To get more details about the subscription, refer to the documentation. To learn the names, parameters, and code samples for all the methods available in your new API, click on the APIs tab.

APIDetails

Now let's create our first Logic App. I always prefer to start with a blank template, but take whatever fits you. Any online file system's trigger will do; in this case I'm using When a file is created from OneDrive. I had some issues with the trigger: it was not always fired by a new file. I tried the When a file is modified trigger, but it didn't solve the problem. If you think you know what I was doing wrong, feel free to leave a comment :).

The first real action is to upload the file to the Azure Video Indexer. We can do that very easily by using the method Upload video and index, passing the name and content from the trigger.

Of course, the longer the video, the longer the processing, so we will need to wait. A way to do that is by adding a waiting loop: we will use the method Get processing state from the Video Indexer and loop until the status is Processed. To slow down the loop, just add a wait action and set it to three or five minutes.
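
Logic Apps hides this behind its Until loop, but expressed in code the pattern is tiny. Here is a minimal Node.js sketch of the same idea; the getState callback and the 'Processed' status value are assumptions standing in for the connector's Get processing state call:

function sleep(minutes) {
    return new Promise(resolve => setTimeout(resolve, minutes * 60 * 1000));
}

// Poll until the video is reported as processed, pausing between attempts
// exactly like the wait action inside the Logic App loop.
async function waitUntilProcessed(getState, waitMinutes) {
    let state = await getState();
    while (state !== 'Processed') {
        await sleep(waitMinutes);
        state = await getState();
    }
    return state;
}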

When the file is completely processed, it is time to retrieve the VTT file. This is done in two simple steps. First, we get the URL by calling the method Get the transcript URL; then, with a simple HTTP GET, we download the file. The last thing we need to do is save it in the folder where our second Logic App will be watching for new drops.

In the visual designer, the Logic App should look like this.

LogicAppGenerateVTT


The Translation


The second Logic App is very short. Once again, it gets triggered by a new file in our OneDrive folder. Then it's time to call our Translator Text API from Azure Cognitive Services. Thanks to the great Logic App interface, it's very intuitive to fill in all the parameters for our call. Once we get the translation, we need to save it into our final destination.
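
Behind the connector, this is roughly the REST call being made for us. Here is a minimal Node.js sketch, assuming the Translator Text API v3 endpoint (the connector normally hides these details); YOUR_API_KEY is a placeholder:

var https = require('https');

function translate(text, toLanguage, callback) {
    var body = JSON.stringify([{ Text: text }]);
    var options = {
        hostname: 'api.cognitive.microsofttranslator.com',
        path: '/translate?api-version=3.0&to=' + toLanguage,
        method: 'POST',
        headers: {
            'Ocp-Apim-Subscription-Key': 'YOUR_API_KEY', // placeholder key
            'Content-Type': 'application/json'
        }
    };
    var request = https.request(options, function (response) {
        var data = '';
        response.on('data', function (chunk) { data += chunk; });
        response.on('end', function () {
            // The response is an array with one entry per input text.
            var result = JSON.parse(data);
            callback(null, result[0].translations[0].text);
        });
    });
    request.on('error', callback);
    request.write(body);
    request.end();
}

// Example: translate one caption line to English.
translate('Bonjour tout le monde!', 'en', function (err, translated) {
    if (!err) { console.log(translated); }
});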

The Logic App should look like this.

LogicAppTranslate


Conclusion


It was much easier than I expected. I really like implementing those integration projects with Logic Apps. It's so easy to "plug" all those APIs together with this interface. And yes, like I mentioned in the introduction, the result was not "great". I ran tests with videos purely in English (even with my accent) or only in French (no mix), and the results were really good. So I think the problem is really the fact that I mix French and English. I could improve the Indexer by spending time providing files so the service could better understand my "Franglish". However, for twenty minutes of work, I'm really impressed by the way it turned out. If you have ideas on how to improve this solution, or if you have some questions, feel free to leave a comment. You can also watch my French YouTube video.

All the code is available online on Github - Cloud5VideoHelper.

How to Fix the [ERROR] Get-ChildItem when Deploying an Azure Resource Group in Visual Studio

Lately, I've been having some trouble when deploying from Visual Studio. At first, I didn't care, since I didn't have time to investigate and also because most of the time I'm using PowerShell or the Azure CLI. However, this issue was not usual for Visual Studio, so I decided to see what the problem was and try to fix it.

The Problem


In a solution, I added a simple Azure Resource Group deployment project just like this one.

simpleProject

Then, when I tried to right-click and do a Deploy...

Deploy

I was getting this error message:

The following parameter values will be used for this operation:
Build started.
Project "TestARMProject.deployproj" (StageArtifacts target(s)):
Project "TestARMProject.deployproj" (ContentFilesProjectOutputGroup target(s)):
Done building project "TestARMProject.deployproj".
Done building project "TestARMProject.deployproj".
Build succeeded.
Launching PowerShell script with the following command:
'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1' -StorageAccountName '' -ResourceGroupName 'TestARMProject' -ResourceGroupLocation 'eastus' -TemplateFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' -TemplateParametersFile 'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.parameters.json' -ArtifactStagingDirectory '.' -DSCSourceFolder '.\DSC'

Account          : Frank Boucher
SubscriptionName : My Subscription
SubscriptionId   : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
TenantId         : xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
Environment      : AzureCloud

VERBOSE: Performing the operation "Replacing resource group ..." on target "".
VERBOSE: 7:06:33 - Created resource group 'TestARMProject' in location 'eastus'

ResourceGroupName : TestARMProject
Location          : eastus
ProvisioningState : Succeeded
Tags              :
TagsTable         :
ResourceId        : /subscriptions/xxxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/TestARMProject

Get-ChildItem : Cannot find path
'D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\azuredeploy.json' because it does not
exist.
At D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject\Deploy-AzureResourceGroup.ps1:108
char:48
+ ... RmResourceGroupDeployment -Name ((Get-ChildItem $TemplateFile).BaseNa ...
+                                       ~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (D:\Dev\local\Te...zuredeploy.json:String) [Get-ChildItem], ItemNotFound
   Exception
    + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.GetChildItemCommand

Deploying template using PowerShell script failed.
Tell us about your experience at https://go.microsoft.com/fwlink/?LinkId=691202

Apparently, the script is failing with Get-ChildItem because a file is missing?! I looked in the folder D:\Dev\local\TestARMProject\TestARMProject\bin\Debug\staging\TestARMProject, and indeed the files were missing! Fortunately, fixing this is in fact really simple.

The Solution


The problem is very simple: when Visual Studio builds the project, it doesn't copy the script files into the build folder (in this case bin\Debug\staging\). In fact, Visual Studio is doing exactly what we are telling it to do. Let's see the build command for those files. Right-click and select Properties (or press Alt+Enter) while azuredeploy.json is selected.

Properties

See how the Build Action is set to None? Change that to Content (for all the scripts), then save and deploy again.
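
If you are curious what that changes in the project file, it is roughly this in the .deployproj (a sketch; the exact item layout may differ in your project):

<!-- Before: None items are not copied to the staging folder -->
<ItemGroup>
  <None Include="azuredeploy.json" />
</ItemGroup>

<!-- After: Content items are copied to bin\Debug\staging\ during the build -->
<ItemGroup>
  <Content Include="azuredeploy.json" />
</ItemGroup>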

enjoy

Enjoy!




My new YouTube Channel

I've been blogging with you for about ten years now on this blog, and it's been a pleasure. More recently I started, quietly, a French blog named Cloud en Français; feel free to have a look.
Today, I'm super excited to share with you my new project: a YouTube channel with original content published every week... in French! Every Wednesday, I will publish a five-minute video answering a problem or a question.

Come see me online, subscribe, ask questions...

Cloud en 5 minutes

youtubechannel





How to access an SQL Database from an Azure Function and Node.js

The other day, a friend asked me how he could add some functionality to an existing application without having access to the code. It was the perfect case to demo some Azure Functions capabilities, so I jumped on the occasion. Because my friend is a Node.js developer on Linux, and I knew it was supported, I decided to try that combination. I know Node, but I'm definitely not an expert since I don't practice very often.

This post is my journey building that demo. I was out of my comfort zone, coding in Node and working on a Linux machine, but not that far... Because these days, you can "do some Azure" from anywhere.

The Goal


Code an Azure Function that connects to an SQL Database (it could be any data source), using Node.js and tools available on Ubuntu.

Note: In this post, I will be using Visual Studio Code, but you could also create your function directly in the Azure Portal or from Visual Studio.

Getting Started


If you are a regular reader of this blog, you know how much I like Visual Studio Code. It's a great tool, available on Mac, Linux, and Windows, that gives you the opportunity to enjoy all its features from anywhere, feeling as if you were in your cozy and familiar environment. If VSCode is not already installed on your machine, go grab your free copy at http://code.visualstudio.com.

Many extensions are available for VSCode, and one gives us the capability to code and deploy Azure Functions. To install it, open VSCode, select the extension icon, and search for Azure Functions; it's the one with the yellow lightning and the blue angle brackets.

AzureFunctionExtension

Create the Azure Function


To get started, let's create an Azure Functions project. Be sure to be in the folder where you wish to create your Function App. Open the Command Palette (Ctrl + Shift + P) and type Azure Functions. Select Azure Functions: Create New Project. That will add some configuration files for the Function App.

Now let's create a Function. You could open the Command Palette again and search for Azure Functions: Create Function, but let's use the UI this time. At the bottom left of the Explorer section, you should see a new section called AZURE FUNCTIONS. Click on the little lightning bolt to Create a new Function.

AzureFuncButton

After you specify the Function App name, the Azure subscription, and other little essentials, a new folder is added to your folder structure, and the function is created. The code of our function is in the file index.js. At the moment of writing this post, only JavaScript is supported by the VSCode extension.

Open the file index.js and replace all its content with the following code.


var Connection = require('tedious').Connection;
var Request = require('tedious').Request;
var TYPES = require('tedious').TYPES;

module.exports = function (context, myTimer) {

    var _currentData = {};

    var config = {
        userName: 'frankadmin',
        password: 'MyPassw0rd!',
        server: 'clouden5srv.database.windows.net',
        options: {encrypt: true, database: 'clouden5db'}
    };

    var connection = new Connection(config);
    connection.on('connect', function(err) {
        context.log("Connected");
        getPerformance();
    });

    // Read the best and average times, then chain to saveStatistic().
    function getPerformance() {

        var request = new Request("SELECT 'Best' = MIN(FivekmTime), 'Average' = AVG(FivekmTime) FROM RunnerPerformance;", function(err) {
            if (err) {
                context.log(err);
            }
        });

        request.on('row', function(columns) {
            _currentData.Best = columns[0].value;
            _currentData.Average = columns[1].value;
            context.log(_currentData);
        });

        request.on('requestCompleted', function () {
            saveStatistic();
        });
        connection.execSql(request);
    }

    // Write the computed statistics back to the database.
    function saveStatistic() {

        var request = new Request("UPDATE Statistic SET BestTime=@best, AverageTime=@average;", function(err) {
            if (err) {
                context.log(err);
            }
        });
        request.addParameter('best', TYPES.Int, _currentData.Best);
        request.addParameter('average', TYPES.Int, _currentData.Average);
        request.on('row', function(columns) {
            columns.forEach(function(column) {
              if (column.value === null) {
                context.log('NULL');
              } else {
                context.log("Statistic Updated.");
              }
            });
        });

        request.on('requestCompleted', function () {
            // Signal completion only once all the database work is done.
            context.done();
        });
        connection.execSql(request);
    }
};

This code is just there to demonstrate how to connect to an SQL Database; it does not represent best practices. At the top, we have some declarations that use the package tedious; I will get back to that later. After that, I create a connection using the configuration declared just before. Then we hook some functions to some events: on the connection's connect event, the function getPerformance() is called to fetch the data.

On the request's row event, we grab the data and do the "math"; then finally, on requestCompleted, we call the second sub-function that updates the database with the new values. To get more information and see more examples about tedious, check the GitHub repository.

Publish to Azure


All the code is ready; it's now time to publish our function to Azure. One more time, you could do that via the Command Palette or the Extension menu. Use the method of your choice and select Deploy to Function App. After only a few seconds, our Function will be deployed to Azure.

Navigate to portal.azure.com and get to your Function App. If you try to Run the Function right now, you will get an error because tedious is not recognized.

Install the dependencies


We need to install the dependencies for the Function App; in this case, tedious. A very simple way is to create a package.json file and to use the Kudu console to install it. Create a package.json file with the following JSON in it:


{
    "name": "CloudEn5Minutes",
    "version": "1.0.0",
    "description": "Connect to Database",
    "repository": {
       "type": "git",
       "url": "git+https://github.com/fboucher/CloudEn5Minutes.git"
    },
    "author": "",
    "license": "ISC",
    "dependencies": {
        "tedious": "^2.1.1"
    }
}

Open the Kudu interface. You can reach it by clicking on the Function App, then the Platform features tab, and finally Advanced tools (Kudu). Kudu is also available directly by the URL [FunctionAppName].scm.azurewebsites.net (e.g. https://clouden5minutes.scm.azurewebsites.net). Select the Debug console CMD. Then, in the top section, navigate to the folder home\site\wwwroot. Drag & drop the package.json file. Once the file is uploaded, type the command npm install to download and install all the dependencies declared in our file. Once it's all done, you should restart the Function App.
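
In the Kudu console, the whole step boils down to these two commands (assuming the default wwwroot location on a Windows Function App):

cd D:\home\site\wwwroot
npm install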

Kudu

Wrapping up & my thoughts


There it is: if you go back now to your Function and try to execute it, it will work perfectly. It's true that I'm familiar with Azure Functions and SQL Database. However, for a first experience using Ubuntu and Node.js in the mix, I was expecting more resistance. One more time, VSCode was really useful, and everything was done with ease.

For those of you who would like to test this exact function, here is the SQL code to generate what is required on the database side.


    CREATE TABLE RunnerPerformance(
        Id           INT IDENTITY(1,1)  PRIMARY KEY,
        FivekmTime   INT
    );

    CREATE TABLE Statistic(
        Id          INT IDENTITY(1,1)  PRIMARY KEY,
        BestTime    INT,
        AverageTime INT
    );

    INSERT Statistic (BestTime, AverageTime) VALUES (1, 1);

    DECLARE @cnt INT = 0;

    WHILE @cnt < 10
    BEGIN
    INSERT INTO RunnerPerformance (FivekmTime)
            SELECT  9+FLOOR((50-9+1)*RAND(CONVERT(VARBINARY,NEWID())));
    SET @cnt = @cnt + 1;
    END;

Reading Notes #303

logo-glyph

Suggestion of the week

  • Writing tests in Postman (joyce) - With all the connected things and all the APIs in our system, this post shows a brilliant and simple way to test all those external calls.




Reading Notes #301

300love

Miscellaneous


  • My First Year as an MVP, part 1 (Jen Kuntz) - Interesting post. It feels soooooo familiar, and yet so far now. I look forward to meeting you in March fellow Canadian MVP.



Lessons Learned with Power Bi and Dynamic DAX Expressions

Since its availability, I have tried to use Power BI as often as I can. It is so convenient to build visuals that explain data coming from a ton of possible data sources, and it's a breeze to share them with clients or colleagues. However, this post is not an infomercial about Power BI; it's about sharing some challenges I encountered trying to prepare a report, and how I fixed them.

The Data Source


The context is simple: all transactions are in one table, and I have a second table with a little information related to clients. To that, I personally like to add a calendar table, because it simplifies my life.

Datamodel

For this report, it is very important to put a slicer by Client.

The Goal


I needed to have one report that shows, for every customer, three different Year To Date (YTD) totals: the classic YTD; a YTD where the beginning of the year is, in fact, when the client started their enrolment; and the last one, a rolling twelve.
It looks pretty simple, and in fact, it's not that complicated. Let's examine each total formula one by one.

Classic Year To Date Total


Before we get started, it's a good practice to reuse Measures to simplify our formulas and make expressions explicit. Let's create a Measure for the Total Sales that will be used inside our other formulas.

TotalSales = SUM('Sales'[Total])

Now the Year To Date is simple to add, by creating a New Measure and entering the formula:

YTDClassic = TOTALYTD([TotalSales], 'Calendar'[Date])

If you activate the Preview features of Power BI, it could be even easier. Look for the button New Quick Measure, select Total Year To Date, fill out the form, and voilà!

QuickMeasure

The generated formula looks a bit different, because Power BI handles errors in the expression.

TotalYTD = 
IF(
    ISFILTERED('Calendar'[Date]),
    ERROR("Time intelligence quick measures can only be grouped or filtered by the Power BI-provided date hierarchy."),
    TOTALYTD([TotalSales], 'Calendar'[Date].[Date])
)

Anniversary Year To Date Total


I spent more time than I was expecting on that one, because the online DAX documentation says that the formula TOTALYTD accepts a third parameter to specify the end of the year. So if I had only one client, with a fixed anniversary date (or a fiscal year), this formula would work, assuming the special date is April 30th:
TOTALYTD([TotalSales], 'Calendar'[Date], "04-30")
However, in my case, the ending date changes with every client. I'm sure right now you are thinking: that's easy Frank, just set a variable, and that's it! Well, it won't work. The thing is, the formula expects a static literal; no variable is allowed, even if it returns a string.
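
To make the failure concrete, here is a hypothetical measure of the kind you would naturally try first; it is rejected because the third argument of TOTALYTD must be a static literal:

Broken Anniversary YTD = 
-- Fails: TOTALYTD does not accept a variable as its year-end parameter
VAR yearEndText = FORMAT ( LASTDATE ( Company[EnrolmentDate] ), "MM-DD" )
RETURN
    TOTALYTD ( [TotalSales], 'Calendar'[Date], yearEndText )
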
The workaround looks a bit hard at first, but it's not that complex: we need to write our own YTD formula. Let's look at the code, and I will explain it after.

Anniversary YTD = 
VAR endingDate =    LASTDATE(Company[EnrolmentDate])
VAR endingMonth =   MONTH ( endingDate )
VAR endingDay =     DAY ( endingDate )
VAR currentDate =   MAX ( Calendar[Date] )
VAR currentYear =   YEAR ( currentDate )
VAR endingThisYear =    DATE ( currentYear, endingMonth, endingDay )
VAR endingLastYear =    DATE ( currentYear - 1, endingMonth, endingDay )
VAR endingSelected =    IF ( endingThisYear < currentDate, endingThisYear, endingLastYear )
RETURN
    CALCULATE (
        [TotalSales] ,
        DATESBETWEEN(Calendar[Date], endingSelected, currentDate)
    )
The first lines are all variable declarations. They are not required, but I find it easier to understand when things are very explicit. Since I'm slicing my report by company, using LASTDATE is just a way to avoid errors; there should be only one record. Then we extract the year, month, and day.
The last variable, endingSelected, identifies whether the anniversary (the end date) has already passed in the current calendar year.
The CALCULATE function returns the TotalSales between the last anniversary date and today.

Rolling twelve Total


For the last formula, the rolling twelve, we will reuse the previous code, but in a simpler way, since the end date is always yesterday.

Rolling 12 Total = 
VAR todayDate =     TODAY()
VAR todayMonth =    MONTH ( todayDate )
VAR todayDay =      DAY ( todayDate )
VAR todayYear =     YEAR ( todayDate )
VAR endingLastYear =    DATE ( todayYear - 1, todayMonth, todayDay + 1 )
RETURN
    CALCULATE (
        [TotalSales] ,
        DATESBETWEEN( Calendar[Date], endingLastYear, todayDate)
    )

Wrap it up


I definitely learned a few things with that Power BI session, but it turned out to be pretty easy. Again, leave a comment or send me an email if you have any comments or questions; I will be very happy to hear from you.



How to know when an Azure Function is running in a Staging slot

I love Azure Functions; I think they are very useful in many scenarios. I use them very often when I want to extend some functionality of an existing system, to avoid having to open the Pandora's box (a.k.a. the code). Recently, I was involved in a project where we used Azure Functions as a scheduled task to process (and generate) a lot of data into a database. We used VSTS to deploy the solution in a staging slot, and everything was really great. If you are interested in learning more about it, see my previous post. In fact, it was working so well that the Azure Function was running and doing its job even while in the staging slot...

In this quick post, I will share two options to prevent an Azure Function from running while in a staging slot.

Option 1: Kudu to the rescue


If you have created a few functions, you probably already know that you have access to some environment variables. They are very useful. I was pretty sure one exists specifying the current slot, but I didn't know its name. Even more, I forgot that all environment variables are displayed in the Kudu interface. D'oh! Thanks to Bruce Chen, who answered my question on Stack Overflow and helped me remember it was all there.

To get to your environment variables list, go to portal.azure.com. Open your Azure Function and select the main node (1). Then, from the right section, select the Platform features tab. Finally, in the Development Tools section, select Advanced tools (Kudu).

WhereisKudu

That will open the Kudu interface in a new tab. You just need to select the Environment tab and you will see them!

EnvironmentVariables

Now, one more thing before we are ready to go. At the time this post is published, Function slots are still in preview, so don't forget to activate the feature ;)

NeedActivatePreview

Let's create a simple Azure Function to show how it works.

using System;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"JustDemoFunc got triggered at: {DateTime.Now}");

    var slotName = System.Environment.GetEnvironmentVariable("APPSETTING_WEBSITE_SLOT_NAME", EnvironmentVariableTarget.Process);
    if (!string.Equals("production", slotName , StringComparison.OrdinalIgnoreCase))
    {
        log.Info($"{DateTime.Now:s} Function is in stagging.");
        return;
    }

    log.Info($"{DateTime.Now:s} Function is in Production is will be executed.");
}

This is a C# timer function. Every time it is executed (every 5 minutes), it writes to the log that it got triggered. Then, with System.Environment.GetEnvironmentVariable("APPSETTING_WEBSITE_SLOT_NAME", EnvironmentVariableTarget.Process), it grabs our slot name. By default the name is Production, so if it's something else... we get out. See the log when in the production slot.

JustDemoFunc got triggered at: 9/24/2017 3:28:16 PM
2017-09-24T15:28:16 Function is in Production and will be executed.
Function completed (Success, Id=###, Duration=21ms)

And now the same code, but running in another slot, in this case named Staging.

JustDemoFunc got triggered at: 9/24/2017 3:25:29 PM
2017-09-24T15:25:29 Function is in staging.
Function completed (Success, Id=###, Duration=161ms)

Option 2: Sticky Setting it is


Another option is also possible using Application Settings. For that, simply add a new setting, in this case named ShouldItRun, and mark it as a slot setting so that each slot keeps its own value. Now let's jump into the code.

using System;
using System.Configuration;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"JustDemoFunc got triggered at: {DateTime.Now}");

    if (!Convert.ToBoolean(ConfigurationManager.AppSettings["ShouldItRun"]))
    {
        log.Info($"{DateTime.Now:s} Function is in stagging.");
        return;
    }

    log.Info($"{DateTime.Now:s} Function is in Production is will be executed.");
}

To be able to read your application settings, you will need using System.Configuration. Then a quick call to ConfigurationManager.AppSettings["ShouldItRun"] will return the value of your setting. Once again, see the log from the production slot.

JustDemoFunc got triggered at: 9/24/2017 4:03:43 PM
2017-09-24T16:03:43 Function is in Production and will be executed.
Function completed (Success, Id=###, Duration=30ms)

And in the staging slot.

JustDemoFunc got triggered at: 9/24/2017 4:02:21 PM
2017-09-24T16:02:21 Function is in staging.
Function completed (Success, Id=###, Duration=8ms)

I hope you enjoyed this little quick hack.


Reading Notes #298

IMG_20170919_161146

Miscellaneous

  • #222: Patrick Lencioni—Getting Hiring Right (EntreLeadership Team) - It was my first episode of this podcast, but definitely not the last one. Very interesting speakers... nice books referenced... loved it.
  • Why Your Boss Makes You Punch a Time Clock (Suzanne Lucas) - Most of us have to do timesheets... I'm sure that at one point, you asked yourself the reason for it. It's time to read an answer, in this post.


Reading Notes #297

Jekyll_AppService

Suggestion of the week

  • How to uninstall Scrum (Erwin Verweij) - When you read that post (because you must read it... seriously), you will smile, giggle, and maybe even laugh.







Lessons learned when deploying multiple databases to Azure with VSTS

It had been a while since I had worked in Visual Studio Team Services (VSTS), and it was a real pleasure to get back into that area. For the solution I was working on, we needed to keep the current database up and running while deploying a new version. For this purpose, we decided to append the release number to the database name (e.g. MyDatabase363). In our Build and Release processes, we needed to identify which databases were from the last release. In this post, I will show what I did, using an inline PowerShell script to get that number and set it as an environment variable so it can be accessible by other tasks.

To get started, let's add an Azure PowerShell task to our build definition. In this post, I use a build process, but of course this is also valid for a release process. To find the task quickly, use the search text box. I will add two of those: one to get the number, and a second to validate that this value is now set as an environment variable and readable from other tasks.
AddPowerShellTask

Now it's time to set up the first task. Fill out all the properties and select Inline Script as the Script Type. It should look like this.

InlineScript

Let's examine the code.
Get last Release Number
$matchingResources = Find-AzureRmResource -ResourceNameContains "mydatabase" -ResourceType "Microsoft.Sql/servers/databases"

$lastRelease = 0

ForEach($resource in $matchingResources)
{
    if ($resource.ResourceName -match '(\d)+$') {
        if($lastRelease -lt $matches[0]){
            $lastRelease = $matches[0]
        }
    }
}
Write-Output "The last release number is:  $lastRelease"
Write-Output ("##vso[task.setvariable variable=lastReleaseNumber;]$lastRelease")
On the first line, I use the Azure PowerShell cmdlet Find-AzureRmResource to get an array of all the databases currently online in my resource group that contain a specific string. In this case, it's the name of the database without the release number. Then I loop through all returned resources, using a very simple regex to extract the release number, and keep the biggest one (the last release).

To close that script, we have two outputs. The first one is to give feedback in the logs, because it's always good to have some information there. The second one looks more complicated, but if you split it up, it's easier to see what's happening. In fact, we are producing a VSTS command (VSTS was previously called Visual Studio Online, which is why it's vso) to initialize a variable: ##vso[task.setvariable variable=lastReleaseNumber;], and of course, assigning to it our last release number: $lastRelease.

To validate that we really found our last release number and assigned it to a variable, let's try to read it back from another step. That is easily done with this code in the other step we created:
Validate the last Release Number
$number = $env:lastReleaseNumber

Write-Output "Confirmation, the last Release Number is:  $number "
The only thing missing before we can run our test is to create that environment variable. To do it, simply go to the Variables tab and add it there.

env-variable

It's all set; run the build and you should see something similar in your logs.

trace


