How I use Azure Logic App, API App and Function App in my life

Why should we do things over and over? I know I don't want to waste my time doing so. Every time I catch myself doing something I have already done, I look for an optimization. Sometimes it's just a little thing, like trying a different path to get from the train station to the office; other times, it's like drinking the rainbow juice directly from the source.

In this post, I will share with you how I turned a three-hour process into a two-minute, automated, customizable solution using Azure Logic App and API App.

Quick Context

For the past 250-something weeks, every Monday I share my reading notes. Those are articles that I read during the previous week, each with a little comment, my two cents about it. I read all those books, articles, and posts on my e-book reader, in this case a Kindle Paperwhite. I use an online service called Readability that can easily clean an article (remove everything except the text and useful images) and send it to my e-book reader. All the articles are kept as a bookmark list.

SendToKindle

When I have time, I read what's on my list. Each time I finish one, I add a note ending with tags between square brackets; those will be used later for searching and filtering.

OnlyNotewithTags

Every Sunday, I extract the notes, then search Readability to find the matching URLs, and build my post for the next morning from all of this. A few years ago, I did a first optimization using a Ruby script, and I blogged about it here. While it has been working very well all those years, I wanted to change it so I don't have to install Ruby and the gems on every machine I use.

Optimization Goals

Here is the list of things I wanted for this new brew of the Reading notes post generator:
  • No install required
  • Not coupled to a device or service
  • Generates a JSON version for future purposes
  • Cheap

Optimization Plan

Azure Logic App is the perfect candidate for this optimization because it doesn't require any local installation. Since it is a flow between connectors, any of them can be swapped to suit the user. In fact, that was the primary factor that made me pick Azure Logic App. For example, in the current solution, I'm using OneDrive as an initial drop zone. If in your environment you would prefer using Dropbox instead of OneDrive, you just need to switch connectors, and nothing else will be affected. Another great advantage is that Azure Logic App is part of the App Service ecosystem, so all those components are compatible.

Here is the full process plan, at the high level.

ReadingNoteLogicApp_blog

  1. Drop the My Clippings.txt file in a OneDrive folder.
  2. Make a copy of the file in Azure Blob Storage, using the Blob Storage built-in connector.
  3. Parse the My Clippings.txt file to extract all the clippings (notes) since the last extraction, using my custom My Clippings API App.
  4. For each note:
    1. Get the URL the post is coming from, using my custom Readability API App.
    2. Extract the tags and category (the first tag is used as the category), using my custom Azure Function.
    3. Serialize the note with all the information in JSON.
    4. Save the JSON file to Azure Blob Storage in a temporary container.
  5. Generate a summary of all the JSON files using my last custom API App, ReadingNotes. It also saves JSON and Markdown files of the summary.
Note: The summary could be published directly, but I decided to keep that last step manual, so I can check for typos and grammar.
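To make the serialization step more concrete, here is a sketch of what one serialized note could look like; the property names below are hypothetical, chosen for this example, and not necessarily the exact ones used in the project:

```json
{
  "title": "Some article title",
  "url": "http://www.example.com/some-article",
  "note": "My two cents about the article.",
  "category": "azure",
  "tags": "azure, cloud",
  "dateAdded": "2016-10-02T08:15:00Z"
}
```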

Let's see more details

Many great tutorials are already available about how to create an Azure Logic App or Azure API App, so in this post, I won't explain how to create them, but will mostly share some interesting snippets or gotchas I encountered during my journey.

In the next screen capture, you can see all the steps contained in my two Logic Apps. I decided to split the process because many tasks need to be done for every note. The main loop (on the left) fetches the notes collection and generates the output. For every note, it calls the child Logic App (on the right).


logicapp-Overview

The ReadingNotes Builder contains:
  1. Initial trigger when a file is created in a OneDrive folder.
  2. Create a copy of the file in Azure Blob storage.
  3. Delete the file in OneDrive.
  4. Get the configuration file from Azure Blob storage.
  5. Call the API App responsible for parsing the file and extracting the recent clippings.
  6. Call the child Logic App for every clipping returned in the previous step (foreach loop).
    • A. Trigger of the child Logic App.
    • B. Call the API App responsible for searching the online bookmark collection and returning the URL of the article/post.
    • C. Call the Azure Function responsible for extracting tags from the note.
    • D. Call the Azure Function responsible for converting the object to JSON.
    • E. Save the JSON object in a file in Azure Blob storage.
    • F. Get the updated configuration file.
    • G. Call the Azure Function responsible for keeping the latest note date.
    • H. Update the configuration with the latest date.
    • I. Return an HTTP 200 code, so the parent Logic App knows the work is done.
  7. Call the API App responsible for building the final markdown file.
  8. Save the markdown file to the Azure Blob storage container returned at the previous step.
  9. Update the final configuration.

As an initial trigger, I decided to use OneDrive because it's available from any device, from anywhere. I just need to drop the MyClippings.txt file in the folder, and the Logic App will start. Here is how I configured it in the editor:


logicappMain-InitialTrigger

MyClippings API App


Kindle's notes and highlights are saved in a local file named My Clippings.txt. To be able to manipulate my notes, I needed a parser. Ed Ryan created the excellent [KindleClippings][KindleClippings], an open-source project available on GitHub. In the current project, I only wrapped that .NET parser in an Azure API App. Wrapping the parser in an API App will help me later to manage security and metrics, and it will simplify the connection to all my other Azure apps.

I won't go into the details of how to create an Azure API App; a lot of great documentation is available online. However, you will need at least the Azure .NET SDK 2.8.1 to create an API App project.

CreateAPIApp_step1


Swagger API metadata and UI are a must, so don't forget to un-comment that section in the SwaggerConfig file.


CreateAPIApp_step2


I needed an array of the notes taken after a specific date. All the heavy work is done in the [KindleClippings][KindleClippings] library. The ArrayKindleClippingsAfter method gets the content of the My Clippings.txt file (previously saved in Azure Blob storage) and passes it to the KindleClippings.Parse method. Then, using LINQ, it returns only the notes taken after the last ReadingNotes publication date. All the code for this My Clippings API App is available on GitHub, but here is the method:


public Clipping[] ArrayKindleClippingsAfter(string containername, string filename, string StartDate)
{
    // Read the My Clippings.txt file previously saved in Azure Blob storage
    var blobStream = StorageHelper.GetStreamFromStorage(containername, filename);
    var clippings = KindleClippings.MyClippingsParser.Parse(blobStream);

    // Keep only the notes added after the last publication date
    var result = (from c in clippings
                    where c.DateAdded >= DateTime.Parse(StartDate)
                    && c.ClippingType == ClippingTypeEnum.Note
                    select c as KindleClippings.Clipping).ToArray<Clipping>();
    return result;
}


Once the Azure API App is deployed in Azure, it can easily be called from the Azure Logic App.


AddCustomAPIApp


A great thing to notice here is that because the Azure Logic App is aware of the required parameters and remembers all the information from the previous steps (shown in different colors), configuring the call is very simple.


logicappMain-GetTheClippings

Azure Logic App calling another Azure Logic App


Now that we have an array of notes, we can loop through each of them to execute the other steps. When I wrote the app, the only way to execute multiple steps in a loop was to call another Azure Logic App. The child Logic App is triggered by an HTTP POST and returns an HTTP 200 code when it's done. A JSON schema is required to define the input parameters; a great tool to build one easily is http://jsonschema.net.
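For illustration, here is a minimal sketch of the kind of schema http://jsonschema.net generates for the child Logic App's HTTP POST trigger; the property names are assumptions made for this example, not necessarily the exact ones used in the real project:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "note": { "type": "string" },
    "dateAdded": { "type": "string" }
  },
  "required": [ "title", "note" ]
}
```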


logicappSub-InitialTrigger

Readability API App

Readability is an online bookmark service that I have been using for many years now. It offers several APIs to parse or search articles and bookmarks. I found a great .NET wrapper on GitHub called CSharp.Readability, written by Scott Smith. I could have called the Readability API directly from the Logic App. However, I needed a little more, so I decided to use Scott's version as the base for this API App.
From the clipping collection returned at the previous step, I only have the title and need to retrieve the URL. To do this, I added a recursive method called SearchArticle.

private BookmarkDetails SearchArticle(string Title, DateTime PublishDate, int Pass)
{
    // Widen the search window around the publication date on every pass
    var retryFactor = 2 * Pass;
    var fromDate = PublishDate.AddDays(-1 * retryFactor);
    var toDate = PublishDate.AddDays(retryFactor);
    var bookmarks = RealAPI.BookmarkOperations.GetAllBookmarksAsync(1, 50, "-date_added", "", fromDate, toDate).Result;
    var result = from b in bookmarks.Bookmarks
                    where b.Article.Title == Title
                    select b as BookmarkDetails;
    if (result.Count() > 0)
    {
        return result.First<BookmarkDetails>();
    }
    // Not found: retry with a wider window, up to four passes
    if (Pass <= 3)
    {
        return SearchArticle(Title, PublishDate, Pass + 1);
    }
    return null;
}

Azure Function: Extract tags

While most of the work was done in the different API Apps, I still needed a few little tools. There were many possibilities, but I decided to take advantage of the new Azure Function Apps; they just sit there waiting to be used! The ReadingNotes Builder uses three Azure Functions; let me share one of them: ExtractTags. An interesting thing about Functions is that you can configure them to be triggered by some event or to act as a webhook.

CreateFunctionApp

To create a Function App as a webhook, you can use one of the templates when you create the new Function, or, from the code editor in the Azure Portal, you can click on the Integrate tab and configure it.


SetWebHookonFunctionApp


Once that's done, you are ready to write the code. Once again, this code is very simple. It starts by validating the input: I'm expecting JSON with a node called note. It then extracts the tags from the note and returns both parts.


public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    string tags;
    string cleanNote;
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);
    if (data.note == null) {
        return req.CreateResponse(HttpStatusCode.BadRequest, new {
            error = "Please pass note properties in the input object"
        });
    }
    string note = $"{data.note}";

    // The tags are between square brackets at the end of the note
    int posTag = note.LastIndexOf('[') + 1;
    if (posTag > 0) {
        tags = note.Substring(posTag, note.Length - posTag - 1);
        cleanNote = note.Substring(0, posTag - 1);
    }
    else {
        tags = "";
        cleanNote = note;
    }
    return req.CreateResponse(HttpStatusCode.OK, new {
        tags = $"{tags}",
        cleanNote = $"{cleanNote}"
    });
}
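To illustrate the behavior, here is a hypothetical request and the response the function would return (the note content is invented for this example):

```
Request body:
{ "note": "Nice overview of Logic Apps [azure, automation]" }

Response body (HTTP 200):
{ "tags": "azure, automation", "cleanNote": "Nice overview of Logic Apps " }
```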


Now, to call it from the Azure Logic App, you will first need to select "Show Azure Functions in the same region" to see your Function App, and then simply add the input; in this case, as expected, a JSON with a node called note.


logicappSub-FunctionCall

What's next

Voila! This is a simple but real application. While I shared only a part of the code in this post, all of it is available on GitHub. I also did a presentation at DevTeach, and other details are present in those slides. Using Azure Logic App to build this application was really interesting and easy. Now that the main pieces are in place, I will be able to grow this environment by adding more functionality, an interface, more security... but that's for another post.




Reading Notes #251

Cloud


Programming


Miscellaneous



Reading Notes #250


 

Suggestion of the week


Cloud


Programming



Reading Notes #249

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous




Reading Notes #248

Programming


Databases


Miscellaneous


Reading Notes #247

Cloud


Programming


Miscellaneous

Reading Notes #246

Cloud


Programming


Databases


Miscellaneous


Reading Notes #245

Cloud


Miscellaneous


Reading Notes #244

Cloud


Programming


Miscellaneous



Everything we Should Know About the new Azure Usage And Billing (AUBI) Portal

(This post is also available in French.)

If a picture is worth a thousand words, then the amount of information available in Azure Usage And Billing (AUBI) is incredible. This portal is an open-source project that was announced a few weeks ago. In this post, I will share my first impressions of it.

Portal

The project is still young, but very much alive. When I installed it, I had one or two minor issues, but by the time I wrote this post, all of them were already fixed.

Where is it?

The Azure Usage And Billing site is not a website like portal.azure.com; it is a solution you need to deploy in an Azure subscription. It doesn't need to be the subscription you wish to monitor, just a subscription you have access to. The solution contains two websites (both with Application Insights, one also with webjobs), a SQL database, and a storage account; you will also need to deploy a Power BI report.

resourcesgroup

All of it can be easily deployed using the included PowerShell script and Azure Resource Manager (ARM) template. Only a few manual steps are required. Fortunately, very clear and complete documentation is available, both in video and written form, on the GitHub project page.

What can I do with it?

Once fully deployed, you will need to navigate to your instance of the Registration portal (ex: http://frankregistrationv12.azurewebsites.net) and register all the subscriptions you want. After the webjobs finish importing all the data, everything will be available in the Power BI reports.
Power BI does an incredible job of showing all the information about your subscription(s). A very useful point here is that all the information present in the dashboard is interactive! Whether you select one or many subscriptions, or only a specific category of Azure service, all the other tiles will adjust automatically.

Aubi_800

What's Next?

If it's not already done, I highly recommend installing the AUBI portal and enjoying the detail of all that information, available to you without any effort and presented in such a beautiful way. For all the details about the prerequisites or the install procedure, go to the GitHub project page.






Reading Notes #243

Suggestion of the week


Cloud


Programming


Miscellaneous



Reading Notes #242

Cloud


Programming

  • Exploring dotnet new with .NET Core (Scott Hanselman) - I discovered the different template types of the dotnet new command during Julie Lerman's talk at DevTeach, and now this post shows a list of incredible opportunities.

Miscellaneous


Reading Notes #241

Cloud


Programming


Miscellaneous


Create and Deploy .NET Core WebApp to Azure from Linux

(This post is also available in French.)

The other day, I was glued to my PC, and I had spare time (yeah, I know, very unusual). Since .NET Core 1.0 had just been released a few days before, I decided to give it a try. To add an extra layer of fun to the mix, I decided to do it from my Ubuntu VM. In less than 15 minutes, everything was done! I was so impressed that I knew I needed to talk about it. That's exactly what this post is about.

The preparation

Before we get started, it's important to know which version of Ubuntu you are using, because some commands will be slightly different. To find out which version you are running, simply click the gear in the top right of the desktop screen and select About this Computer. In my case, since I'm using Ubuntu 14.04 LTS, I will be using the commands related to this version. If you are using a different version, please refer to the .NET Core documentation.

ubuntu_version

Now we need to install .NET Core. Nothing could be easier. Open a Terminal (Ctrl+Alt+T) and type these three commands:

# Setup the apt-get feed adding dotnet as repo
sudo sh -c 'echo "deb [arch=amd64] https://apt-mo.trafficmanager.net/repos/dotnet/ trusty main" > /etc/apt/sources.list.d/dotnetdev.list'
sudo apt-key adv --keyserver apt-mo.trafficmanager.net --recv-keys 417A0893

# Get the latest packages
sudo apt-get update

# Install .NET Core SDK
sudo apt-get install dotnet-dev-1.0.0-preview2-003121
Once it's all done, you can type dotnet --info and you should see something like this:

dotnet_info


Create the Local WebApp

From the same Terminal we will create an empty folder for our solution and move into it. Execute these commands.
mkdir demodotnetweb
cd demodotnetweb
We now want to create our new web project. This is done with the dotnet new command, but we need to specify the type; otherwise, it will create a console application.
dotnet new -t web
Now, to download all the references (NuGet packages) our project requires, execute the following command:
dotnet restore
Depending on the speed of your Internet connection and how many dependencies are required, this can take from a few seconds to more than one minute.
To test if our solution works locally type the command:
dotnet run
That will compile our solution and start the ASP.NET Core internal hosting. Launch a web browser and go to http://localhost:5000 to see the app in action.

dotnetcore_localhost

Deploy to Azure

To deploy our new project to the cloud, we will use the continuous deployment capability of Azure. First, navigate to portal.azure.com and create a Web App.

create_webApp

Once the application is created, you should see the "This web app has been created" message when you navigate to [nameofWebApp].azurewebsites.net.

successfully_created

It's now time to add a local Git repository. In the Web App Settings, select Deployment source, click on Configure Required Settings, then select the Local Git Repository option.

add_source_control

After setting the user credentials for the repository, we can get the URL from the Essentials section.

repourl

Back to our Ubuntu Terminal, we will have to execute these commands:

# create a git repository
git init
# stage and commit all files
git add -A
git commit -m "Init"

# Add the remote repository
git remote add azure https://username@demowebcore.scm.azurewebsites.net:443/demowebcore.git

# Push the code to the remote
git push azure master
After a minute or so, you should have your web app online!

dotnetcore_azure


Voila! That was fun!

Reading Notes #240

Cloud


Programming


Data


Miscellaneous


Reading Notes #239

Cloud


Programming


Miscellaneous



Reading Notes #238

Cloud


Programming


Miscellaneous

  • Back to my core - In this post, Darrel shares the beginning of a new adventure... Congratulations to you, and I'm looking forward to reading from you again. Microsoft gained a really good developer.


Reading Notes #237

Cloud


Programming


Miscellaneous



Reading Notes #236


Suggestion of the week


Cloud


Programming


Miscellaneous

  • Happiness is DevOps’ Cornerstone (Alexandre Brisebois) - An interesting post that asks a lot of questions... I would like to see some graphs or a pie chart of our answers.
  • 5 Habits that Help Code Quality (Erik Dietrich) - Yet another post about how to code better, but this one is refreshing. It explains why the opposite would be harmful and also gives us a training plan for a better chance of success.
  • What’s in your highlights folder? (Marc Gagne) - Because life is not only mistakes and bad luck. Here are good tips to help you bring some sunshine into your life when needed.

Automating Docker Deployment with Azure Resource Manager

Recently, I had to build a solution where Docker containers were appropriate. The idea behind containers is that once they are built, you just have to run them. While that's true, my journey was not exactly that; nothing dramatic, only a few gotchas that I will explain while sharing my journey.

The Goal

The solution is classic, and this post will focus on a single virtual machine (VM). The Linux VM needs a container that automatically runs when the VM starts. Some files are first downloaded from a secure location (Azure Blob storage), then passed to the container. The solution is deployed using Azure Resource Manager (ARM). For simplicity, I will use an Nginx container, but the same principle applies to any container. Here is the diagram of the solution.

Docker-in-Azure2

The Solution

I decided to use the Azure CLI to deploy, since it is also the command line used inside the Linux VM, but Azure PowerShell could do the same. We will be deploying an ARM template containing a Linux VM and two VM extensions: DockerExtension and CustomScriptForLinux. Once the VM is provisioned, a bash script is downloaded by the CustomScriptForLinux extension from the secure Azure Blob storage account myprojectsafe, then executed.
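To give an idea of the shape of that ARM template, here is a heavily trimmed sketch of the two VM extension resources; the VM name, API versions, and script URL are placeholders for this example, and the real template contains more settings (such as protectedSettings for the storage account key):

```json
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "myLinuxVM/DockerExtension",
  "apiVersion": "2015-06-15",
  "location": "[resourceGroup().location]",
  "properties": {
    "publisher": "Microsoft.Azure.Extensions",
    "type": "DockerExtension",
    "typeHandlerVersion": "1.0",
    "autoUpgradeMinorVersion": true
  }
},
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "myLinuxVM/CustomScriptForLinux",
  "apiVersion": "2015-06-15",
  "location": "[resourceGroup().location]",
  "properties": {
    "publisher": "Microsoft.OSTCExtensions",
    "type": "CustomScriptForLinux",
    "typeHandlerVersion": "1.4",
    "settings": {
      "fileUris": [ "https://myprojectsafe.blob.core.windows.net/scripts/setup.sh" ],
      "commandToExecute": "bash setup.sh"
    }
  }
}
```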

Reading Notes #235

Suggestion of the week


Cloud