Reading Notes #284

Cloud

Programming

Miscellaneous




Reading Notes #283

Suggestion of the week


Cloud


Programming


Databases


Miscellaneous


Reading Notes #282

Cloud


Programming


Let’s talk Hybrid Cloud with Clemens Vasters at a Special Montreal Event.

The summer is almost here, and that usually means a break for the community group, but the Azure Group of MSDevMtl still has one more surprise for you. Clemens Vasters, Microsoft's "Principal Messenger", who drives the global technical strategy for Microsoft's Azure Messaging platform services (Azure Service Bus, Azure Event Hubs, and Azure Relay), is coming to Montreal!

Join us on Friday, June 9th, 2017, for a great talk about “Hybrid Cloud”. These days, most solutions need to run on a combination of on-premises and public cloud assets. In this morning session, we will learn how those systems can be integrated. This perspective is, however, largely constrained to data-center-like assets, and typically to the scope of one organization.

For this special event, we decided to remove the usual fee, so it's a FREE event!

Register quickly on the Meetup site because seats are limited. Register here: https://www.meetup.com/msdevmtl/events/239731426/


See you there!


Reading Notes #281

Cloud


Programming


Databases


Miscellaneous



Reading Notes #280

Cloud


Miscellaneous


From a Docker container to MySQL as a Service in Azure in 5 minutes

Hello MySQL! It's been a while, eh? You were at version 3 something; I was just getting started with my professional career. We had fun for years... Then, you know, things changed, and I did something else. I was really happy when Microsoft announced, at MSBuild, the availability of MySQL as a Service in Azure.
 
[Image: SearchMySQL]

Creating a MySQL database with the portal is extremely simple. As usual, you enter the server name, the database name, and the admin password. At the time I'm writing this post, it is not possible to use any CLI, but I'm sure it will be available shortly. For those who are not used to Database as a Service in Azure, one thing you will need to do to get access to your database from your computer is to whitelist your IP. It's very easy to do from the Azure portal: just select the Connection Security tab in the left menu and add your address. Oh! And don't forget to click the Save button. ;)
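For reference, here is a sketch of what the equivalent Azure CLI commands look like once available (the resource group, server name, password, and IP address below are assumptions):

# Create an Azure Database for MySQL server (names are hypothetical)
az mysql server create --resource-group FrankDemoGroup --name frankmysqldemo --location eastus --admin-user frank --admin-password "Passw0rd!" --sku-name GP_Gen5_2

# Whitelist your IP address so you can connect from your machine
az mysql server firewall-rule create --resource-group FrankDemoGroup --server frankmysqldemo --name AllowMyIP --start-ip-address 203.0.113.42 --end-ip-address 203.0.113.42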

[Image: Firewall]

During my tests, I tried different applications (WordPress, Azure WebApp, a custom on-premises app) that use MySQL as the backend database. I didn't notice any problem, and performance was great. It was just... simpler; no server to configure, no VM to manage, no updates. The only "issue" I got was trying to connect Power BI Desktop to MySQL, but I think it's more related to the drivers, since the service was still in early preview. I notified Microsoft, and I'm sure it will be fixed shortly.

Since it had been a while since I did any real work with MySQL, I didn't have a client installed on my laptop. In fact, I had no idea which one I should pick.

I knew we can run a CLI inside a Docker container with an interactive interface, so I decided to give it a try. A quick docker search mysql showed me that an image existed. Here are the steps to get set up.

First, let's download the image and create a container named mySQLTools running MySQL 8.0:

docker run --name mySQLTools --env "MYSQL_ROOT_PASSWORD=Passw0rd" -d mysql:8

Then, using the -it options, let's bring the bash prompt into our terminal:

docker exec -it mySQLTools bash -l 

Finally, we connect with the mysql client using the usual settings (note that there must be no space between -p and your password):

mysql -h ServerName.database.windows.net -u UserName@ServerName -pMyPassword DatabaseName

[Image: result]

Voila! That's all it takes to get started. And by the way, the same trick works great with an Azure SQL Database; just use a sql-cli image instead:

docker pull shellmaster/sql-cli

docker run -it --rm --name=sqlTools shellmaster/sql-cli mssql -s ServerName.database.windows.net -u UserName@ServerName -p YourPassword -d DatabaseName -e





Reading Notes #279

Cloud


Programming

  • Contributing to .NET for Dummies (Rion Williams) - Another post where the author shares his experience (I love those; that's real life), this one about participating in an open-source project.

Databases


Miscellaneous



A static website and some little tricks with Azure Functions Proxies

(Updated 2018-02-08)

Recently, I did a few presentations about Azure Functions. The reaction was always very positive, and attendees left with tons of project ideas in their heads. In this post, I would like to add a few interesting features that I didn't have time to talk about.

Would you prefer to watch a video instead of reading? No problem, skip directly to the Explained in a Video section at the end of this post.

Let's get started


From the Azure portal (portal.azure.com), select an Azure Function domain, or create a new one. Then we need to create a Function App that we will use as our backend. Click the "+" sign beside Functions. In this post, we will be using the HttpTrigger-CSharp template, but other templates will work too. Once you select the template, you will be able to enter the name and select the Authorization level. This last choice affects how your function can be accessed. For example, if you select Anonymous, then your function will be accessible to everyone directly using the URL: https://notesfunctions.azurewebsites.net/api/SecretFunction, where 'notesfunctions' is my function domain name. But if you select the Function or Admin level, then you will need to pass a Function Key or Master Key (ex: https://notesfunctions.azurewebsites.net/api/SecretFunction?code=I4BN6NjaZBmPNqebqnX8pQAPZwe1TI/O4TCbvB1aaaaao7uRO5yyw==). For this post, let's use the Function level. When ready, click the Create button.

[Image: CreateFunction]

Using Postman, your favorite HTTP tool, or even the function Test section (located on the right side of the editor in the Function blade), you can now test your Function. To do that, we need to know its URL: once your function is selected (when you see the code), click </> Get function URL at the top of the screen.

[Image: TestSecretDirect]

Note that the querystring has a parameter named code that receives our function key. When this parameter is not present, you will receive an HTTP 401 Unauthorized message. The function generated by the template also expects a value Name, which can be passed in the querystring or as a Name property in the JSON body of the HTTP request.
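For reference, the code generated by the HttpTrigger-CSharp template looked roughly like this at the time (a sketch; your generated run.csx may differ slightly):

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    // Try to get the Name value from the querystring
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Fall back to a Name property in the JSON body of the request
    dynamic data = await req.Content.ReadAsAsync<object>();
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}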

Azure Functions Proxies


Function Proxies are currently in preview. With them, any Function App can now define an endpoint that serves as a reverse proxy to something else: an API, a webhook, another Function App, anything.
Before being able to create new Azure Functions Proxies, you need to enable them. From the Function blade, select the Settings tab at the top of the screen, then click the On button under the Proxies section.

[Image: EnableProxies]

Now let's create our first Function Proxy. Click on the "+" on the right of Proxies (Preview). Enter the following values.

[Image: ProxySalutation]

In the Backend URL, note how %Host_Name% is used in the URL; this is NOT an environment variable. In fact, surrounding a key with % is a very useful Azure Functions feature that reads the value directly from the Application settings.
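For reference, here is a sketch of what this first proxy could look like in the proxies.json file that we will open later (the route, the method, and the %Function_Key% application setting are assumptions based on my setup):

"Salutation": {
    "matchCondition": {
        "methods": [
            "GET"
        ],
        "route": "salutation"
    },
    "backendUri": "https://%Host_Name%/api/SecretFunction?code=%Function_Key%"
}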

[Image: PlatformFeatures]

To get to the Application settings, select the Function App domain (the root node), then the Platform features tab at the top of the screen. In the image above, point A shows how to access the Application settings, and point B shows how to access the App Service Editor that we will use later in this post.

If it's not already done, add a new key in the Application settings: Host_Name, with its value. Then, from Postman, call this new proxy function. Note that now you don't need to pass the key, since that part is done under the hood by the proxy.

[Image: TestSalutation]

Do more with your Proxies


Okay, now that we have a proxy up and running, let's switch to the App Service Editor to do more "advanced" stuff (the editor is available through the Platform features tab). Once you are in the editor, select the file proxies.json to open it.

[Image: editor]

As you can see, we only have one proxy defined. Let's duplicate our proxy: rename the copy "Override", and change the route value to override too. If you test this new proxy, it will work just like the other one. Now let's change it a little: under the backendUri property, add a new node called responseOverrides. With proxies, it is possible to edit the HTTP properties. To change the Content-Type to text instead of JSON, add "response.headers.Content-Type": "text/plain" inside our new responseOverrides node (be aware, it's case sensitive). Test Override again, and you will see that the content has indeed changed.
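The Override proxy should end up looking like this sketch (again, the %Function_Key% setting is my assumption):

"Override": {
    "matchCondition": {
        "route": "override"
    },
    "backendUri": "https://%Host_Name%/api/SecretFunction?code=%Function_Key%",
    "responseOverrides": {
        "response.headers.Content-Type": "text/plain"
    }
}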

Continuing this way, you could use Azure Function Proxies as mocks. For example, drop the backendUri property and override the response body to return a fixed value, and voila! You built yourself a great mock-up! This is very useful! To illustrate this, add a new proxy using this code:
"Fake": {
    "matchCondition": {
        "route": "fake"
    },
    "responseOverrides": {
        "response.headers.Content-Type": "text/plain",
        "response.body": "Hello from Azure"
    }
}
If you call this last proxy, no backend is called, but the HTTP call still works.
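You can try it with a simple call; assuming the notesfunctions domain used earlier, something like:

curl https://notesfunctions.azurewebsites.net/fake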

Static Website


Everybody knows that Azure storage is very inexpensive. Wouldn't it be wonderful if we could put a static website in that storage? Of course, you can do it, I mean, as long as the URL is complete. However, who types the URL completely, with the file name and extension (ex: http://www.frankysnotes.com/index.html)? Well, now with Azure Function Proxies, we can fix that! Add another proxy to the proxies.json file using this code:
"StaticNotes": {
    "matchCondition": {
        "methods": [
            "GET"
        ],
        "route": "/"
    },
    "backendUri": "https://%blob_url%/dev/index.html"
}
This new proxy will "redirect" all root HTTP GET calls to our index.html file waiting in our Azure Blob storage. For a more professional look, you just need to add a custom domain name to your Function App, and you get the perfect super-light, low-cost website for your promotional campaign or event.

[Image: static]


Explained in a Video





References:

  • Postman: getpostman.com
  • App Service Editor: https://{function domain name}.scm.azurewebsites.net (ex: https://notesfunctions.scm.azurewebsites.net)



Reading Notes #278

Cloud


Programming


Miscellaneous

  • Designing a Conversation (Alexandre Brisebois) - Interesting post that digs into a paradox: we humans have been communicating since our beginnings but still have trouble doing it, and now we want to plan ahead and architect communication with machines.
  • Introduction to Microsoft To-Do (Gunnar Peipman) - Interesting app; Microsoft finally built its own to-do service.



Reading Notes #277

Cloud

Programming

Miscellaneous

  • Bots are the new Apps – Part 2 (Alexandre Brisebois) - This post asks a lot of questions. Bots could be very powerful, but are they easy to build? Iterate, test, fail, learn, and try again.


Two Ways To Build a Recursive Logic App

Almost everything is possible; it's always a question of how much time and energy we have. The other day, I was building an Azure Logic App and needed to make it recursive. At first, I didn't think that would be a problem, because we can easily call a Logic App from another one. That's true: nested Logic Apps are possible, but a Logic App can only call a different one; the compiler doesn't allow recursive calls. I quickly found a workaround, but the day after, I came up with a cleaner solution.

In this post, I will share both ways to create a recursive call of a Logic App.

Common Parts


Let's assume that our goal is to crawl a folder structure. It could be done with any file connector: Dropbox, OneDrive, Google Drive, etc. For this post, I will use the SharePoint Online connector.

First, let's create our Logic App. Use the Http Request-Response template.

  1. Our Logic App will receive the folder path that it needs to process as a parameter. This is easily done by defining a JSON schema in the Request trigger.
    {
        "$schema": "http://json-schema.org/draft-04/schema#",
        "properties": {
            "FolderName": {
            "type": "string"
            }
        },
        "type": "object"
    }
  2. To list all the content of the current folder, add a List folder action from the SharePoint Online connector. The File identifier should be the trigger parameter: FolderName.

    Note: You need to edit the code behind for this action; I noticed a strange behavior with spaces in the folder path.

    The current code should look like this:
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(triggerBody()?['FolderName'])}" 

    Change it by doubling the encodeURIComponent
    "path": "/datasets/@{encodeURIComponent(encodeURIComponent('https://mysharepoint.sharepoint.com/sites/FrankSiteTest'))}/folders/@{encodeURIComponent(encodeURIComponent(triggerBody()?['FolderName']))}" 
  3. For each element returned, we need to check if it's a folder. One property of the returned object is IsFolder; we just need to use it in our condition:
    @equals(bool(item()?['IsFolder']), bool('True'))
    1. If it's a folder, we do the recursive call, passing the path of the current folder as FolderName.
    2. Otherwise, when it's a file, do some process.

[Image: LogicApp_commun]

Method 1: The quick and easy


Since we are forced to call a different Logic App, let's clone ours. Close the editor, click the Clone button, and name the copy FolderCrawler2.

[Image: cloneLogicApp]

Now we need to edit both Logic Apps, adding a call to the other one: FolderCrawler calls FolderCrawler2, and FolderCrawler2 calls FolderCrawler.

[Image: FolderCrawler2]

This method is really just a workaround, but it works perfectly. What I like about it is that it uses all the IntelliSense at our disposal in the editor. Obviously, the big disadvantage is the code duplication.

Method 2: The clean and light


The real way to do a recursive call is to use the URL from the Request trigger in an HTTP POST action, passing in the body a JSON document that matches the schema, with the current folder path as the FolderName value.

[Image: httpcall]
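In Code view, the HTTP action could look like this sketch (the uri is the callback URL of your own Request trigger, shortened here, and I'm assuming the Path property of the current item holds the folder path):

"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "POST",
        "uri": "https://prod-00.eastus.logic.azure.com/workflows/.../triggers/manual/paths/invoke?api-version=2016-06-01&sp=...&sv=...&sig=...",
        "body": {
            "FolderName": "@{item()?['Path']}"
        }
    },
    "runAfter": {}
}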

I hope you enjoyed this little post. If you think of another way to do recursive calls, or if you have some questions, let me know!





Reading Notes #276

Suggestion of the week


Cloud


Programming


Miscellaneous

Reading Notes #275

Cloud


Programming


Miscellaneous



~Frank



Reading Notes #274

Suggestion of the week


Cloud


Programming



Passing a file from an Azure Logic App to a Web API

(This post is also available in French.)

Logic App is one of my favorite tools in my cloud toolbox. It's very easy to connect things together, sometimes without even coding! Last week, I needed to pass a file from a SharePoint folder to an API. I've moved files tons of times using Azure Logic Apps, but this time something was not working. Thanks to Jeff Hollan (@jeffhollan), who put me on the right path with his great advice, my problem was quickly solved. In this post, I will share with you the little things that make all the difference in this case.

The Goal


When a file is created in a SharePoint folder, an Azure Logic App needs to get triggered and pass the file name and its content to a Web API. In this case, I'm using SharePoint, but it will work the same way for all folder connector types (ex: Dropbox, OneDrive, Box, Google Drive, etc.).

Note:
In this post, I'm using SharePoint Online, but the same thing could perfectly work with a SharePoint on-premises or in a virtual machine. In that situation, the On-premises Data Gateway needs to be installed locally. It's very easy to do; just follow the instructions. One gotcha... you MUST use the same Microsoft account of type "work or school" to connect to portal.azure.com and to install the On-premises Data Gateway.

The Web API App


Let's start by building our Web API. In Visual Studio, create a new Web API App. If you would like more details about how to create one, see my previous post. Now, create a new controller and add a new function UploadNewFile with the following code:

[SwaggerOperation("UploadNewFile")]
[SwaggerResponse(HttpStatusCode.OK)]
[Route("api/UploadNewFile")]
[HttpPost]
public HttpResponseMessage UploadNewFile([FromUri] string fileName)
{
    if (string.IsNullOrEmpty(fileName))
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Name.");
    }

    var filebytes = Request.Content.ReadAsByteArrayAsync();

    if (filebytes.Result == null || filebytes.Result.Length <= 0)
    {
        return Request.CreateResponse(HttpStatusCode.NoContent, "No File Content.");
    }

    // Do what you need with the file.

    return Request.CreateResponse(HttpStatusCode.OK);
}

The [FromUri] tag before the parameter is just a way to specify where that information comes from. The content of the file couldn't be passed in the querystring, so it is passed through the body of the HTTP request and retrieved with the code Request.Content.ReadAsByteArrayAsync(). If everything works, we return an HttpResponseMessage with HttpStatusCode.OK; otherwise, a message about the problem. You can now publish your Web API App.

In order to be able to see our Web API App from our Logic App, one more thing needs to be done. From the Azure portal, select the freshly deployed App Service and, from the options section (the left area with all the properties), select CORS, then type * and save.

[Image: changeCORS]
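If you prefer scripting, the same setting can be applied with the Azure CLI; a sketch, assuming a hypothetical resource group name:

az webapp cors add --resource-group FrankDemoGroup --name frankdemo --allowed-origins '*'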

The Logic App


Assuming that you already have a SharePoint site up and running, let's create the new Logic App. Once the Logic App is deployed, click the Edit button to open the designer and select the Blank template. In this post, I need a SharePoint trigger: when a New File is created. At this point, you will be asked to answer a few questions in order to create your SharePoint connection. Once it's done, select the folder where you will be "dropping" your files.

Now that the trigger is done, we will add our first (and only) action. Click Add Step, select the available functions, then our App Service, and finally the method UploadNewFile.
[Image: SelectApiApp]
Thanks to Swagger, Logic App is able to generate a parameter form for us. Put the file name in the fileName parameter textbox. The Logic App should look like this:

[Image: FullLogicApp]

The last thing we need to do is tell our Logic App to pass the file content in the body of the HTTP request to the API. Today, it's not possible to do that using the designer. As you probably know, behind that gorgeous designer sits a simple JSON document, and it's by editing it that we will specify how to pass the file content.

Switch to Code view and find the step that calls our API App. Simply add "body": "@triggerBody()" to that node. That tells the Logic App to bind the body of the trigger (the file content) and pass it as the body of our web request. The code should look like this:

"UploadNewFile": {
    "inputs": {
        "method": "post",
        "queries": {
        "fileName": "@{triggerOutputs()['headers']['x-ms-file-name']}"
        },
        "body": "@triggerBody()",
        "uri": "https://frankdemo.azurewebsites.net/api/UploadNewFile"
    },
    "metadata": {
        "apiDefinitionUrl": "https://frankdemo.azurewebsites.net/swagger/docs/v1",
        "swaggerSource": "website"
    },
    "runAfter": {},
    "type": "Http"
}

You can now save and exit the edit mode. The solution is ready, enjoy!


Reading Notes #273

Cloud


Programming


Miscellaneous



Secure an ASP.NET MVC multi-tenant Power BI Embedded report hosted in an Azure WebApp

Note: This post was originally published on the Microsoft MVP Award blog, as part of the Technical Tuesday series.

Power BI gives us the ability to create amazing reports. Even if it's great to be able to share those reports from the very secure Power BI portal, sometimes we need to share them inside other applications or websites. Once again, Power BI doesn't disappoint us, providing Power BI Embedded. In this post, I will explain how to use Power BI Embedded and make it secure, so each tenant can see only his own data.

The Problem

Despite the many posts online that explain how to use filters to change what is visible in our reports, filters can easily be changed by the user. Even if you hide the filter pane, those settings could easily be modified using JavaScript... Therefore, it's definitely not the best way to secure private information.

The Solution

In this post, I will be using roles to limit access to the data. The well-known Adventure Works database will be used to demonstrate how to partition the data; in this case, we will be using the Customer table.

In Azure

Open the Azure portal to create a Power BI Embedded component. Of course, in a real project it would be better to create it with an Azure Resource Manager (ARM) template, but to keep this post simple we will create it in the portal. Click on the big green "+" at the top left corner. In the search box, type powerbi and hit Enter. Select Power BI Embedded in the list and click the Create button. Once it's created, go to the Access Keys property of the brand-new Power BI Workspace Collection and take note of the key. We will need that key later to upload our Power BI report.

[Image: CreateWorkSpaceCollection]
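For the curious, such an ARM template could be as minimal as this sketch (the collection name and SKU below are assumptions):

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.PowerBI/workspaceCollections",
            "apiVersion": "2016-01-29",
            "name": "FrankWrkSpcCollection",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "S1",
                "tier": "Standard"
            }
        }
    ]
}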

For this demo, the data source will be Adventure Works in an Azure SQL Database. To create it, simply click the "+" button again and select Database. Be sure to select Adventure Works as the source if you want to reproduce this demo.

[Image: createDB]


In Power BI Desktop

Power BI Desktop is a free tool from Microsoft that will help us create our report; it can be downloaded here.
Before we get started, two options need to be modified. Go to the File menu and select Options and Settings, then Options. The first one is in the Preview Features section (tab); check the option Enable cross filtering in both directions for DirectQuery. The second is in the DirectQuery section; check the option Allow unrestricted measures in DirectQuery mode. It's a good idea to restart Power BI Desktop before continuing.

[Image: powerbioptions]

To create our report, we first need to connect to our data source, in this case our Azure database. Click the Get Data button, then Azure, and after that Microsoft Azure SQL Database. It's important to pay attention to the type of connection, Import or DirectQuery, because you won't be able to change it afterward; you would need to rebuild your report from scratch. For this case, select DirectQuery.
The report will display information about invoice details. Be sure to include the table that will be used for your role; in this case, I will be using Customer. Each customer must see only their own invoices.

[Image: tables]

The report will contain two charts: the left one is a bar chart showing the invoice history; the right one is a pie chart that shows how the products in the invoice(s) are distributed by category.
Note: in the sample database, all customers have only one invoice, and they all have the same date.

[Image: chart_noRole]

Now we need to create our dynamic role. In the Modeling tab, click Manage Roles and create a CustomerRole, mapping the CompanyName of the Customer table to the variable USERNAME().

[Image: genericRole]
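In the Manage Roles dialog, the table filter DAX expression for the Customer table is a simple equality:

[CompanyName] = USERNAME()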

Of course, to test that our charts really are dynamic, create other roles and give them specific values, ex: "Bike World" or "Action Bicycle Specialists". To visualize your report as one of those users, simply click View as Roles in the Modeling tab and select the role you want.

[Image: ViewAs]

See how the charts look when seen as "Action Bicycle Specialists".

[Image: chart_withRole]

The report is now ready. Save it; we will need it soon.


powerbi-cli

To upload our report into our Azure Workspace Collection, I like to use powerbi-cli because, thanks to Node.js, it runs everywhere.
Open a command prompt or terminal and execute the following command to install powerbi-cli:
npm install powerbi-cli -g
Now, if you type powerbi, you should see the powerbi-cli help displayed.

[Image: powerbicli]

It's time to use the access key we noted previously; use it in this command to create a workspace in our workspace collection.

//== Create Workspace ===========
powerbi create-workspace -c FrankWrkSpcCollection -k my_azure_workspace_collection_access_key

Now, let's upload our Power BI report into Azure. Retrieve the workspace ID returned by the previous command and pass it as the -w (workspace) parameter.

//== Import ===========
powerbi import -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -f "C:\powerbidemo\CustomerInvoices.pbix" -n CustomerInvoices -o

Now we need to update the connection string of our dataset. Get its ID with the following command:

//== Get-Datasets ===========
powerbi get-datasets -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key 

Then update the connection string, passing the dataset ID with the -d parameter:

//== update-connection ===========
powerbi update-connection -c FrankWrkSpcCollection -w workspaceId -k my_azure_workspace_collection_access_key -d 01fcabb6-1603-4653-a938-c83b7c45a59c -u usename@servername -p password


In Visual Studio

All the Power BI Embedded part is now complete. Let's create the new ASP.NET MVC Web Application. A few NuGet packages are required; be sure to have these versions or newer:
  • Microsoft.PowerBI.AspNet.Mvc version="1.1.7"
  • Microsoft.PowerBI.Core version="1.1.6"
  • Microsoft.PowerBI.JavaScript version="2.2.6"
  • Newtonsoft.Json version="9.0.1"
By default, Newtonsoft.Json is already there but needs an upgrade:

Update-Package Newtonsoft.Json

And for the Microsoft.PowerBI packages, one install command should take care of all the other dependencies:

Install-Package Microsoft.PowerBI.AspNet.Mvc

We also need to add all the access information we previously used with powerbi-cli into our application. Let's add them in the web.config.

...
<appSettings>
    <add key="powerbi:AccessKey" value="my_azure_workspace_collection_access_key" />
    <add key="powerbi:ApiUrl" value="https://api.powerbi.com" />
    <add key="powerbi:WorkspaceCollection" value="FrankWrkSpcCollection" />
    <add key="powerbi:WorkspaceId" value="01fcabb6-1603-4653-a938-c83b7c45a59c" />
</appSettings>
...

Here is the code of the InvoicesController:

using System;
using System.Configuration;
using System.Linq;
using System.Web.Mvc;
using demopowerbiembeded.Models;
using Microsoft.PowerBI.Api.V1;
using Microsoft.PowerBI.Security;
using Microsoft.Rest;
namespace demopowerbiembeded.Controllers
{
    public class InvoicesController : Controller
    {
        private readonly string workspaceCollection;
        private readonly string workspaceId;
        private readonly string accessKey;
        private readonly string apiUrl;
        public InvoicesController()
        {
            this.workspaceCollection = ConfigurationManager.AppSettings["powerbi:WorkspaceCollection"];
            this.workspaceId = ConfigurationManager.AppSettings["powerbi:WorkspaceId"];
            this.accessKey = ConfigurationManager.AppSettings["powerbi:AccessKey"];
            this.apiUrl = ConfigurationManager.AppSettings["powerbi:ApiUrl"];
        }
        private IPowerBIClient CreatePowerBIClient
        {
            get
            {
                var credentials = new TokenCredentials(accessKey, "AppKey");
                var client = new PowerBIClient(credentials)
                {
                    BaseUri = new Uri(apiUrl)
                };
                return client;
            }
        }
        public ReportViewModel GetFilteredRepot(string clientName)
        {
            using (var client = this.CreatePowerBIClient)
            {
                var reportsResponse = client.Reports.GetReportsAsync(this.workspaceCollection, this.workspaceId);
                var report = reportsResponse.Result.Value.FirstOrDefault(r => r.Name == "CustomerInvoices");
                var embedToken = PowerBIToken.CreateReportEmbedToken(this.workspaceCollection, this.workspaceId, report.Id, clientName, new string[] { "CustomerRole" });
                var model = new ReportViewModel
                {
                    Report = report,
                    AccessToken = embedToken.Generate(this.accessKey)
                };
                return model;
            }
        }
        public ActionResult Index()
        {
            var report = GetFilteredRepot("Action Bicycle Specialists");
            return View(report);
        }
    }
}

The interesting part of this controller is the method GetFilteredRepot. First, it gets all the reports from our workspace, then looks for the one named CustomerInvoices. The next step is where the loop gets closed: creating the token. Of course, we pass the workspace collection, workspace, and report references, and that could be it; however, passing only those references would result in a report where the data of all customers is displayed... and obviously, that's not what we want here. The two last parameters are a username and an array of roles. When we created roles in Power BI Desktop, we created one called CustomerRole that was equal to the variable USERNAME(). So here we pass the client name as the username and specify that we want to use the role CustomerRole.
The last piece of the puzzle is the View, so let's add one.

@model demopowerbiembeded.Models.ReportViewModel
<style>iframe {border: 0;border-width: 0px;}</style>
<div id="test1" style="border-style: hidden;">
    @Html.PowerBIReportFor(m => m.Report, new { id = "pbi-report", style = "height:85vh", powerbi_access_token = Model.AccessToken })
</div>
@section scripts
{
    <script src="~/Scripts/powerbi.js"></script>
    <script>
        $(function () {
            var reportConfig = {
                settings: {
                    filterPaneEnabled: false,
                    navContentPaneEnabled: false
                }
            };
            var reportElement = document.getElementById('pbi-report');
            var report = powerbi.embed(reportElement, reportConfig);
        });
    </script>
}

One great advantage of using ASP.NET MVC is that we have the @Html.PowerBIReportFor helper at our disposal. Then we instantiate the report with the call powerbi.embed(reportElement, reportConfig), where I pass some configuration to remove the navigation and filter panes, but that's optional.

Now, if we run our project, you should see a result looking like this.

[Image: finalresult]


Wrap it up

Voila! This was, of course, a demo and should be optimized. Please leave a comment if you have any questions, or don't hesitate to contact me; it's always great to chat with you.





Reading Notes #272

Cloud


Programming


Databases



Where can I put my data in Azure?


This month, I’m the guest of Mario Cardinal (@mario_cardinal) and Guy Barrette (@GuyBarrette) on their Podcast The Visual Studio Talk Show.  A French Podcast that talk software architecture with Microsoft's technology. 
Alexandre Brisebois (@Brisebois) was also present on this episode, and the four of us spent about an hour talking about Data in Azure, and try to clarify the Microsoft offer.

You can listen to the episode here:  http://visualstudiotalkshow.libsyn.com/205-alexandre-brisebois-et-franois-boucher-les-donnes-et-azure

I did a little mind map before the show to help me keep it as structured as possible. I'm sharing it with you here:
[Image: Azure Data mind map]

Version (3231x1130) here: http://cloudenfrancais.com/content/images/2017/03/Azure-Data.png

~Frank