Passing a file from an Azure Logic App to a Web API

(This post is also available in French.)

Logic App is one of my favorite tools in my cloud toolbox. It's very easy to connect things together, sometimes without even coding! Last week, I needed to pass a file from a SharePoint folder to an API. I have moved files tons of times using Azure Logic Apps, but this time something was not working. Thanks to Jeff Hollan (@jeffhollan), who put me on the right path with some great advice, my problem was quickly solved. In this post, I will share with you the little things that make all the difference in this case.

The Goal


When a file is created in a SharePoint folder, an Azure Logic App needs to get triggered and pass the file name and its content to a Web API. In this case, I'm using SharePoint, but it will work the same way for all folder connector types (e.g. Dropbox, OneDrive, Box, Google Drive, etc.).

Note:
In this post, I'm using SharePoint Online, but the same thing could perfectly work with a SharePoint on-premises or in a virtual machine. In that situation, the On-premises Data Gateway needs to be installed locally. It's very easy to do, just follow the instructions. One gotcha... You MUST use the same Microsoft account of type "work or school" to connect to the Azure portal and to install the On-premises Data Gateway.

The Web API App


Let's start by building our Web API. In Visual Studio, create a new Web API App. If you would like more details about how to create one, see my previous post. Now, create a new controller and add a new function UploadNewFile with the following code:

[SwaggerOperation("UploadNewFile")]
[SwaggerResponse(HttpStatusCode.OK)]
[Route("api/UploadNewFile")]
[HttpPost]
public async Task<HttpResponseMessage> UploadNewFile([FromUri] string fileName)
{
    if (string.IsNullOrEmpty(fileName))
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Name.");
    }

    // The file content is passed in the body of the HTTP request.
    var filebytes = await Request.Content.ReadAsByteArrayAsync();

    if (filebytes == null || filebytes.Length <= 0)
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Content.");
    }

    // Do what you need with the file.

    return Request.CreateResponse(HttpStatusCode.OK);
}

The [FromUri] attribute before the parameter is just a way to specify where that information comes from. The content of the file can't be passed in the query string, so it is passed through the body of our HTTP request, and it is retrieved with Request.Content.ReadAsByteArrayAsync(). If everything works we return an HttpResponseMessage with HttpStatusCode.OK, otherwise a message about the problem. You can now publish your Web API App.
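To make the contract concrete, here is a minimal sketch of how a client could call this endpoint with HttpClient. It is only an illustration: the URL matches the one used later in the Logic App definition, and the file name and local path are made-up examples.

using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class UploadClient
{
    static void Main()
    {
        RunAsync().GetAwaiter().GetResult();
    }

    static async Task RunAsync()
    {
        // Example values; replace with your own file.
        var fileName = "demo.txt";
        var fileBytes = File.ReadAllBytes(@"C:\temp\demo.txt");

        using (var client = new HttpClient())
        {
            // The file name travels in the query string ([FromUri]);
            // the file content travels in the body of the POST.
            var url = "https://frankdemo.azurewebsites.net/api/UploadNewFile?fileName="
                      + Uri.EscapeDataString(fileName);

            var response = await client.PostAsync(url, new ByteArrayContent(fileBytes));
            Console.WriteLine(response.StatusCode);
        }
    }
}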

In order to be able to see our Web API App from our Logic App, one more thing needs to be done. From the Azure portal, select the freshly deployed App Service and, in the options section (the left area with all the properties), select CORS, then type * and save it.

(Image: changeCORS)

The Logic App


Assuming that you already have a SharePoint up and running, let's create the new Logic App. Once the Logic App is deployed, click the Edit button to go into the designer and select the Blank template. For this post, I need a SharePoint trigger for when a new file is created. At this point, you will be asked to answer a few questions in order to create your SharePoint connector. Once it's done, select the folder where you will be "dropping" your files.

Now that the trigger is done, we will add our first (and only) action. Click Add Step, select the available functions, then our App Service, and finally the method UploadNewFile.
(Image: SelectApiApp)
Thanks to Swagger, the Logic App designer can generate a parameter form for us. Put the file name in the Filename parameter textbox. The Logic App should look like this:

(Image: FullLogicApp)

The last thing we need to do is tell our Logic App to pass the file content in the body of the HTTP request to the API. Today, it's not possible to do it using the designer. As you probably know, behind that gorgeous interface sits a simple JSON document, and it's by editing this document that we will specify how to pass the file content.

Switch to Code view, and find the step that calls our API App. Simply add "body": "@triggerBody()" to that node. That will tell the Logic App to bind the body of the trigger (the file content) and pass it to the body of our web request. The code should look like this:

"UploadNewFile": {
    "inputs": {
        "method": "post",
        "queries": {
        "fileName": "@{triggerOutputs()['headers']['x-ms-file-name']}"
        },
        "body": "@triggerBody()",
        "uri": "https://frankdemo.azurewebsites.net/api/UploadNewFile"
    },
    "metadata": {
        "apiDefinitionUrl": "https://frankdemo.azurewebsites.net/swagger/docs/v1",
        "swaggerSource": "website"
    },
    "runAfter": {},
    "type": "Http"
}

You can now save and exit the edit mode. The solution is ready, enjoy!


Reading Notes #273

Cloud


Programming


Miscellaneous



Secure an ASP.NET MVC multi-tenant Power BI Embedded hosted in an Azure Web App

Note: This post was originally published on the Microsoft MVP Award blog, as part of the Technical Tuesday series.

Power BI gives us the possibility to create amazing reports. Even if it's great to be able to share those reports from the very secure Power BI portal, sometimes we need to share them inside other applications or websites. Once again, Power BI doesn't disappoint us, by providing Power BI Embedded. In this post, I will explain how to use Power BI Embedded and make it secure so each tenant can only see his own data.

The Problem

Although many online posts explain how to use filters to change what is visible in our reports, filters can easily be changed by the user. Even if you hide the filter pane, those settings could easily be modified using JavaScript... Therefore, it's definitely not the best way to secure private information.

The Solution

In this post, I will be using roles to limit access to the data. The well-known Adventure Works database will be used to demonstrate how to partition the data. In this case, we will be using the Customer table.

In Azure

Open the Azure portal to create a Power BI Embedded component. Of course, in a real project it would be better to create it with an Azure Resource Manager (ARM) template, but to keep this post simple we will create it with the portal. Click the big green "+" at the top left corner. In the search box type powerbi and hit Enter. Select Power BI Embedded in the list and click the Create button. Once it's created, go to the Access Keys property of the brand-new Power BI Workspace Collection and take note of the key. We will need that key later to upload our Power BI report.

(Image: CreateWorkSpaceCollection)

For this demo, the data source will be Adventure Works in an Azure SQL Database. To create it, simply click the "+" button again and select Database. Be sure to select Adventure Works as the source if you want to reproduce this demo.

(Image: createDB)


In Power BI Desktop

Power BI Desktop is a free tool from Microsoft that will help us create our report; it can be downloaded here.
Before we get started, two options need to be modified. Go to the File menu and select Options and Settings, then Options. The first one is in the Preview Features section (tab); check the option Enable cross filtering in both directions for DirectQuery. The second is in the DirectQuery section; check the option Allow unrestricted measures in DirectQuery mode. It's a good idea to restart Power BI Desktop before continuing.

(Image: powerbioptions)

To create our report we first need to connect to our data source, in this case our Azure SQL Database. Click the Get Data button, then Azure, and after that Microsoft Azure SQL Database. It's important to pay attention to the type of connection, Import or DirectQuery, because you won't be able to change it afterward; you would need to rebuild your report from scratch. For this case, select DirectQuery.
This report will display information about invoice details. Be sure to include the table that will be used for your role. In this case, I will be using Customer. Each customer must see only their own invoices.

(Image: tables)

The report will contain two charts: the left one is a bar chart showing the invoice history, the right one is a pie chart that shows how the products in the invoice(s) are distributed by category.
Note: in the sample database, all customers have only one invoice and they are all on the same date.

(Image: chart_noRole)

Now we need to create our dynamic role. In the Modeling tab, click on Manage Roles and create a CustomerRole that maps the CompanyName column of the Customer table to the USERNAME() function; the table filter DAX expression is simply [CompanyName] = USERNAME().

(Image: genericRole)

Of course, to test if our charts are really dynamic, create other roles and give them specific values, e.g. "Bike World" or "Action Bicycle Specialists". To visualize your report as one of those users, simply click on View as Roles, in the Modeling tab, and select the role you want.

(Image: ViewAs)

See how the charts look when seen as "Action Bicycle Specialists".

(Image: chart_withRole)

The report is now ready. Save it; we will need it soon.


Powerbi-cli

To upload our report into our Azure Workspace Collection, I like to use PowerBI-CLI because it runs everywhere, thanks to Node.js.
Open a command prompt or terminal and execute the following command to install PowerBI-CLI:
npm install powerbi-cli -g
Now if you type 'powerbi' you should see the powerbi-cli help displayed.

(Image: powerbicli)

It's time to use the access key we noted previously, in this command, to create a workspace in our workspace collection.

//== Create Workspace ===========
powerbi create-workspace -c FrankWrkSpcCollection -k my_azure_workspace_collection_access_key

Now, let's upload our Power BI report into Azure. Retrieve the workspace ID returned by the previous command and pass it as the parameter -w (workspace).

//== Import ===========
powerbi import -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -f "C:\powerbidemo\CustomerInvoices.pbix" -n CustomerInvoices -o

Now we need to update the connection string of our dataset. Get its ID with the following command:

//== Get-Datasets ===========
powerbi get-datasets -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key 

Now update the connection string, passing the dataset ID with the parameter -d:

//== update-connection ===========
powerbi update-connection -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -d 01fcabb6-1603-4653-a938-c83b7c45a59c -u username@servername -p password


In Visual Studio

All the Power BI Embedded part is now completed. Let's create the new ASP.NET MVC Web Application. A few NuGet packages are required; be sure to have these versions or newer:
  • Microsoft.PowerBI.AspNet.Mvc version="1.1.7"
  • Microsoft.PowerBI.Core version="1.1.6"
  • Microsoft.PowerBI.JavaScript version="2.2.6"
  • Newtonsoft.Json version="9.0.1"
By default Newtonsoft.Json is already there but needs an upgrade.
Update-Package Newtonsoft.Json
And for the Microsoft.PowerBI one, an install command should take care of all the other dependencies.

Install-Package Microsoft.PowerBI.AspNet.Mvc

We also need to add all the access information we previously used with powerbi-cli to our application. Let's add it to the web.config.

...
<appSettings>
    <add key="powerbi:AccessKey" value="my_azure_workspace_collection_access_key" />
    <add key="powerbi:ApiUrl" value="https://api.powerbi.com" />
    <add key="powerbi:WorkspaceCollection" value="FrankWrkSpcCollection" />
    <add key="powerbi:WorkspaceId" value="01fcabb6-1603-4653-a938-c83b7c45a59c" />
</appSettings>
...

Here is the code of the InvoicesController:

using System;
using System.Configuration;
using System.Linq;
using System.Web.Mvc;
using demopowerbiembeded.Models;
using Microsoft.PowerBI.Api.V1;
using Microsoft.PowerBI.Security;
using Microsoft.Rest;
namespace demopowerbiembeded.Controllers
{
    public class InvoicesController : Controller
    {
        private readonly string workspaceCollection;
        private readonly string workspaceId;
        private readonly string accessKey;
        private readonly string apiUrl;
        public InvoicesController()
        {
            this.workspaceCollection = ConfigurationManager.AppSettings["powerbi:WorkspaceCollection"];
            this.workspaceId = ConfigurationManager.AppSettings["powerbi:WorkspaceId"];
            this.accessKey = ConfigurationManager.AppSettings["powerbi:AccessKey"];
            this.apiUrl = ConfigurationManager.AppSettings["powerbi:ApiUrl"];
        }
        private IPowerBIClient CreatePowerBIClient
        {
            get
            {
                var credentials = new TokenCredentials(accessKey, "AppKey");
                var client = new PowerBIClient(credentials)
                {
                    BaseUri = new Uri(apiUrl)
                };
                return client;
            }
        }
        public ReportViewModel GetFilteredReport(string clientName)
        {
            using (var client = this.CreatePowerBIClient)
            {
                var reportsResponse = client.Reports.GetReportsAsync(this.workspaceCollection, this.workspaceId);
                var report = reportsResponse.Result.Value.FirstOrDefault(r => r.Name == "CustomerInvoices");
                var embedToken = PowerBIToken.CreateReportEmbedToken(this.workspaceCollection, this.workspaceId, report.Id, clientName, new string[] { "CustomerRole" });
                var model = new ReportViewModel
                {
                    Report = report,
                    AccessToken = embedToken.Generate(this.accessKey)
                };
                return model;
            }
        }
        public ActionResult Index()
        {
            var report = GetFilteredReport("Action Bicycle Specialists");
            return View(report);
        }
    }
}
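The ReportViewModel class used by the controller and the view is not listed in this post. A minimal sketch of it, assuming it only needs to carry the report metadata and the generated embed token, could look like this (the Report type comes from the Power BI SDK; its exact namespace may vary with the version):

using Microsoft.PowerBI.Api.V1.Models; // assumption: the Report model lives here in this SDK version

namespace demopowerbiembeded.Models
{
    public class ReportViewModel
    {
        // The Power BI report to embed.
        public Report Report { get; set; }

        // The embed token generated for the current tenant.
        public string AccessToken { get; set; }
    }
}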

The interesting part of this controller is the method GetFilteredReport. First, it gets all the reports from our workspace, then looks for the one named "CustomerInvoices". The next step is where the loop gets closed: it creates the embed token. Of course, we pass the workspace collection, workspace and report references, and that could be it; but passing only those references would result in a report where all customers are displayed, and that's obviously not what we want here. The two last parameters are a username and an array of roles. When we created roles in Power BI Desktop, we created one called CustomerRole that maps CompanyName to the USERNAME() variable, so here we pass the client name as the username and specify that we want to use the role "CustomerRole".
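In this demo the client name is hard-coded in Index, but in a real multi-tenant application you would resolve it from the authenticated user. Here is a hedged sketch, assuming (hypothetically) that the tenant's company name was stored in a custom "CompanyName" claim at sign-in:

public ActionResult Index()
{
    // Hypothetical claim added by your authentication pipeline at sign-in.
    var identity = (System.Security.Claims.ClaimsIdentity)User.Identity;
    var companyName = identity.FindFirst("CompanyName")?.Value;

    var report = GetFilteredReport(companyName);
    return View(report);
}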
The last piece of the puzzle is the View, so let's add one.

@model demopowerbiembeded.Models.ReportViewModel
<style>iframe {border: 0;border-width: 0px;}</style>
<div id="test1" style="border-style: hidden;">
    @Html.PowerBIReportFor(m => m.Report, new { id = "pbi-report", style = "height:85vh", powerbi_access_token = Model.AccessToken })
</div>
@section scripts
{
    <script src="~/Scripts/powerbi.js"></script>
    <script>
        $(function () {
            var reportConfig = {
                settings: {
                    filterPaneEnabled: false,
                    navContentPaneEnabled: false
                }
            };
            var reportElement = document.getElementById('pbi-report');
            var report = powerbi.embed(reportElement, reportConfig);
        });
    </script>
}

One great advantage of using ASP.NET MVC is that we have @Html.PowerBIReportFor at our disposal. Then we can instantiate the report with the call to powerbi.embed(reportElement, reportConfig);, where I pass some configuration to remove the navigation and filter panes, but that's optional.

Now if we run our project, we should see a result looking like this:

(Image: finalresult)


Wrap it up

Voila! This, of course, was a demo and should be optimized. Please leave a comment if you have any questions, or don't hesitate to contact me. It's always great to chat with you.





Reading Notes #272

Cloud


Programming


Databases



Where can I put my Data In Azure


This month, I'm the guest of Mario Cardinal (@mario_cardinal) and Guy Barrette (@GuyBarrette) on their podcast The Visual Studio Talk Show, a French podcast that talks about software architecture with Microsoft technology.
Alexandre Brisebois (@Brisebois) was also present on this episode, and the four of us spent about an hour talking about data in Azure, trying to clarify the Microsoft offering.

You can listen to the episode here:  http://visualstudiotalkshow.libsyn.com/205-alexandre-brisebois-et-franois-boucher-les-donnes-et-azure

I did a little "mindmap" before the show to help me keep it as structured as possible. I'm sharing it with you here:
(Image: Azure Data mindmap)

Version (3231x1130) here: http://cloudenfrancais.com/content/images/2017/03/Azure-Data.png

~Frank

Reading Notes #271


 

 

Suggestion of the week

  • Azure Functions vs Web Jobs: How to choose? (Chris Pietschmann) - Wow! This is the best post I've read about Azure Functions. It really describes clearly the differences between the two. A must for every cloud developer.

Cloud


Programming


Databases


Miscellaneous




Reading Notes #270

Suggestion of the week


Cloud


Databases


Miscellaneous



Reading Notes #269

Suggestion of the week


Cloud


Programming


Miscellaneous



Reading Notes #268

Cloud


Programming


Miscellaneous



Reading Notes #267

Cloud


Programming


Miscellaneous




Reading Notes #266

Suggestion of the week


Cloud


Programming


Databases

  • SQL Database Query Editor available in Azure Portal (Ninar Nuemah) - I was looking for this since the old query tool was removed. I will probably continue to use SQL Server Management Studio or VSCode, but what a time saver when you are investigating a problem... Open a blade right from the Azure portal and voila!

Miscellaneous

  • MVP API Intro (Daron Yöndem) - I love it! I already have a few ideas in mind, and I'm curious to see what you guys will do.


Reading Notes #265

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous



Reading Notes #264

Cloud


Programming

  • Introducing Docker 1.13 (Docker Core Engineering) - This post summarizes all the great features added in the new release and shows again why Docker is such a fantastic tool in the containers' world.

Databases


Miscellaneous


Reading Notes #263

Cloud


Programming


Databases



Connect VSCode to Azure SQL Database from everywhere

Visual Studio Code looks like a simple text editor. However, the fact that it runs anywhere (Windows, Linux, macOS) and that you can highly customize it makes it very special. The way you customize your VSCode is by using extensions. The last extension I installed was MSSQL. In this post, I will explain how to leverage this extension to connect to your database from everywhere.

Installation


Adding the extension is very simple: go to the Extensions menu, type mssql in the search area, then click Install.

(Image: install_mssql)

Configuration


To create your connection profile, you can just open the Command Palette and type connect; the extension will then ask you to enter the server name, username, password...

(Image: connect_longway)
To avoid any typo I tend to copy/paste the information, but every time the dialog lost focus it closed. The workaround to this unfortunate problem was to open the User Settings and specify my connection information there. Here is the configuration:

"mssql.connections": [
    {
        "server": "{{put-server-name-here}}",
        "database": "{{put-database-name-here}}",
        "user": "{{put-username-here}}",
        "password": "{{put-password-here}}"
    },
    {
        "profileName": "Azure SQL FBoucher",
        "server": "sqlservername.database.windows.net",
        "database": "demodb",
        "user": "admin",
        "password": "",
        "authenticationType": "SqlLogin",
        "savePassword": true
    }
]

I kept password empty and savePassword equal to true. This way, the first time you connect to the database you will need to enter the password, and it will be saved outside of the User Settings using password management.
Now, to connect, you only need to open the Command Palette and type connect or mssql to see the MS SQL: Connect option, then select the connection profile you just created.

(Image: connect_profile)


Using the mssql extension


Of course, you can type any query you like, but the extension offers not only snippets but also contextual auto-completion. That means that once connected, VSCode will know all the column names of all the tables and boost your productivity.

(Image: snippets)
To execute the query, a simple Ctrl + E, and voila!

(Image: query_and_result)


Wrapping up


This fantastic extension provides a lightweight setup to write or execute any SQL script, whether you are on Windows, Linux or macOS. It doesn't have all the features of bigger tools like SQL Server Management Studio, but when databases are not your primary focus, or when you want a light setup on the go, the Visual Studio Code SQL Server extension is definitely a must.






Reading Notes #262

Cloud


Programming


Databases


Miscellaneous

  • Identity vs Permissions (Dominick Baier) - Good post that demystifies some points between two distinct but very often mixed-up concepts.


Reading Notes #261

Suggestion of the week


Cloud


Programming


Miscellaneous




Need to Nuke an Azure Subscription?

(This post is also available in French.)


I use my my.visualstudio (aka MSDN) Azure subscription very intensely, to create content for demos or just to try new features, so I frequently need to do some cleaning.

(Image: multi-resourceGroup-boom)

Here is a little script that will completely delete all the resources of every resource group inside a specific subscription. To be able to execute this script you will need the Azure PowerShell cmdlets.

The script asks you to log in, then lists all the subscriptions that this account has access to. Once you specify which one, it will list all the resources grouped by resource group. Then, as a final warning, it will require one last validation before nuking everything.

Be careful.

#================================================================
#= Very dangerous interactive script that deletes all resources
#= from all resource groups in a specific subscription
#================================================================

# How to install and configure Azure PowerShell
# https://docs.microsoft.com/en-us/powershell/azureps-cmdlets-docs/

# Login
Login-AzureRmAccount 

# Get a list of all Azure subscript that the user can access
$allSubs = Get-AzureRmSubscription 

$allSubs | Sort-Object Name | Format-Table -Property Name, SubscriptionId, State

$theSub = Read-Host "Enter the subscriptionId you want to clean"

Write-Host "You select the following subscription. (it will be display 15 sec.)" -ForegroundColor Cyan
Get-AzureRmSubscription -SubscriptionId $theSub | Select-AzureRmSubscription 

# Get all the resource groups
$allRG = Get-AzureRmResourceGroup

foreach ( $g in $allRG){
    Write-Host $g.ResourceGroupName -ForegroundColor Yellow 
    Write-Host "------------------------------------------------------`n" -ForegroundColor Yellow 
    $allResources = Find-AzureRmResource -ResourceGroupNameContains $g.ResourceGroupName
    if($allResources){
        $allResources | Format-Table -Property Name, ResourceName
    }else{
        Write-Host "-- empty--`n"
    } 
    Write-Host "`n`n------------------------------------------------------" -ForegroundColor Yellow 
}

$lastValidation = Read-Host "Do you wish to delete ALL the resources previously listed? (YES/ NO)"
if($lastValidation.ToLower().Equals("yes")){
    foreach ( $g in $allRG){
        Write-Host "Deleting " $g.ResourceGroupName 
        Remove-AzureRmResourceGroup -Name $g.ResourceGroupName -Force -WhatIf # remove -WhatIf to really delete
    }
}else{
    Write-Host "Aborded. Nothing was deleted." -ForegroundColor Cyan
}




The code is also available on Github: https://github.com/FBoucher/AzurePowerTools


How to use Azure Function App to crush an SQL Database on a schedule

In a project, I needed to run a task every day to process some data in an Azure SQL Database. I thought an Azure Function App would be the perfect candidate for that, because functions can be attached to a schedule and I only get charged when they are running. In this post, I will create a function that will be executed every five minutes. It will read the information from a SQL table Person and write the stats into another SQL table Statistic.

Azure Setup


Let's start by creating the Azure Function App. From the Azure portal (portal.azure.com) click the "+" sign in the top left corner and in the search textbox type Function App. Fill up the creation form as usual. Note that it is a good idea to put your Function App in the same location as what it will be interacting with, in this case an SQL Database. Once the Function App is created, it will be possible to create a new function. For that you can start with an empty one or use one of the multiple templates available. For this post, the TimerTrigger-CSharp template was perfectly indicated.

(Image: createAzureFunctionApp)

On this page you can configure the schedule. Here, I set it to 0 */5 * * * * so the task runs every five minutes. The schedule uses a CRON expression with six fields ({second} {minute} {hour} {day} {month} {day-of-week}); to learn more about it you can (should) refer to the documentation. You can edit that value later by going to the Integrate tab of the function. Great, now that we have our function, we need to provide the SQL Database connection string. In this demo, I will use App Settings, but it could also be saved in Azure Key Vault. Remember, Function Apps are part of the Azure App Service ecosystem, so they have App Settings. Accessing them is really simple.

(Image: GoToAppSettings)

In the left panel, click on the Function app settings option, then among the many choices look for Configure app settings. That will open the usual app settings blade that we use with other Web Apps. Scroll down until you reach the Connection strings section. Add the connection string to your server... And don't forget to save!

(Image: saveConnStr)

Database Setup


For this demo, I will use two simple tables.
CREATE TABLE [dbo].[Person] (
    PersonID     INT NOT NULL IDENTITY(1,1) PRIMARY key,
    Firstname    VARCHAR(50)  NOT NULL, 
    Lastname     VARCHAR(50)  NOT NULL,
    Age          INT  NOT NULL
)
I will manually insert records, adding people with random ages. The Function App, on its side, will read the information from the table Person and calculate the average age and the number of people in the population, finally inserting the result into the table Statistic.
CREATE TABLE [dbo].[Statistic] (
    StatisticID    INT NOT NULL IDENTITY(1,1) PRIMARY key,
    Population     INT NOT NULL, 
    AverageAge     INT  NOT NULL,
    DateTaken      VARCHAR(50)  NOT NULL
)

Coding the Function App


The code of this function is not styled or optimized. It was kept very simple so it is obvious what it is doing. Here is the code.
#r "System.Data"
using System;
using System.Configuration;
using System.Data.SqlClient;
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");  
    var str = ConfigurationManager.ConnectionStrings["sqlConn"].ConnectionString;
    using (SqlConnection conn = new SqlConnection(str))
    {
        var sum = 0;
        var cnt = 0;
        var avr = 0;
        conn.Open();
        var sqlStr = "SELECT Age FROM [dbo].[Person]";
        using (SqlCommand cmd = new SqlCommand(sqlStr, conn))
        {
            var dataReader = await cmd.ExecuteReaderAsync();
            while(dataReader.Read()){
                sum += dataReader.GetInt32(0);
                cnt++;
            }
            avr = (cnt > 0) ? (sum / cnt) : 0; // avoid dividing by zero when the table is empty
            log.Info($"The average age of the population is {avr}.");
        }
        var sqlInsert = $"INSERT INTO [dbo].[Statistic](Population, AverageAge, DateTaken) VALUES ( {cnt} , {avr} , '{DateTime.Now}' )";
        using(SqlCommand cmd  = new SqlCommand(sqlInsert, conn)){
            var rows = cmd.ExecuteNonQuery();
            log.Info($"{rows} rows were inserted");
        }
    }  
}
The first line adds a reference to System.Data. Many libraries are already available to any Function App and just require a reference. When you need an external library, you need to create a project.json file and add it to the dependencies; those will be loaded via NuGet. From there, the code speaks for itself: it begins by getting the connection string from the configuration, gets all the people and does its magic, then finally inserts the result into the Statistic table.

Let's Run It


Everything is now in place. It's time to use your favorite SQL tool to insert some records into the Person table. Personally, I created a little console application in .NET Core to do that; I share it at the end of this post. Once it's done, you will see the table Statistic growing every five minutes. Voila! Azure Function Apps are very useful and effective in this scenario. The code, of course, can be part of your repository, and now that the Azure Functions Tools have just been released you can even code them from Visual Studio!


Little Bonus


For a while now, every time I need to generate data, I tend to use a nice framework created by my MVP buddies the ASP.NET Monsters, called GenFu. Available on GitHub, GenFu is a library that generates realistic test data. Here is the code I used to generate the population in this demo.
using System;
using System.Data.SqlClient;
using GenFu;
namespace ConsoleApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            int counter = (args.Length > 0) ? Convert.ToInt32(args[0]) : 5;
            var Users = A.ListOf<Person>(counter);
            var connStr = "Server=YOURSERVERNAME.database.windows.net,1433;Initial Catalog=YOURDATABASENAME;Persist Security Info=False;User ID=YOURUSER;Password=YOURPASSWORD;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;";
            using (SqlConnection conn = new SqlConnection(connStr))
            {
                conn.Open();
                foreach (var u in Users)
                {
                    var sqlStr = $"INSERT INTO [dbo].[Person] (Firstname,Lastname,Age) VALUES ( '{u.Firstname}' , '{u.Lastname}' , {u.Age} )";
                    using (SqlCommand cmd = new SqlCommand(sqlStr, conn))
                    {
                        var rows = cmd.ExecuteNonQuery();
                        Console.WriteLine($"{rows} rows were inserted");
                    }
                }
                conn.Close();
            }
        }
    }
    public class Person
    {
        public string Firstname { get; set; }
        public string Lastname { get; set; }
        public int Age { get; set; }
    }
}
You will need to add the GenFu reference to the project.json file.
{
"version": "1.0.0-*",
"buildOptions": {
    "debugType": "portable",
    "emitEntryPoint": true
},
"dependencies": {
        "Genfu": "1.2.1",
        "System.Data.SqlClient": "4.3.0"
},
"frameworks": {
    "netcoreapp1.0": {
    "dependencies": {
        "Microsoft.NETCore.App": {
        "type": "platform", 
        "version": "1.0.1"
        }
    },
    "imports": "dnxcore50"
    }
},
"runtimes": {
"win-x64":{}
}
}





Happy coding!




Reading Notes #260

Suggestion of the week


Cloud


Programming


Miscellaneous