Books
Listen Like You Mean It: Reclaiming the Lost Art of True Connection
Author: Ximena Vengoechea
Making Numbers Count: The Art and Science of Communicating Numbers
Authors: Chip Heath, Karla Starr


It Doesn't Have to Be Crazy at Work
Authors: Jason Fried, David Heinemeier Hansson

Azure REST APIs with Insomnia (Jon Gallant) - Awesome post. I didn't know that extension and had never tried the import. Very useful.
Microsoft Ignite 2021 Book of News (Microsoft) - If you are looking for the complete list of all the announcements made at Microsoft Ignite, here it is.
Top 5 things to get the most out of Microsoft Ignite (Rick Claus) - Nice tips to get ready.
Tip 307 - How to create faster and smarter apps with .NET 5 | Azure Tips and Tricks - A nice quick review of .NET improvements.
.NET Interactive with SQL! | .NET Notebooks in Visual Studio Code (Maria Naggaga) - A very interesting tool when you want to do a quick exploration of the data or validate some queries while in your code.
Don't forget about the GitHub Command Line (Scott Hanselman) - I forgot about that CLI... That could be useful.
Generate iCal calendar with .NET using iCAL.NET (Kristoffer Strube) - A nice quick tutorial for that package. Looking forward to seeing it in the latest .NET version.
IntelliSense for appsettings.json | ASP.NET Blog (James Newton-King) - That's a cool feature! How many times have I been in the settings, struggling with the format or the name of a property? No more Internet searches!
What Makes VSCode so Popular? (Richard So) - A great post that will put words to your thoughts. And if you don't use VS Code, it will explain why so many people love it so much.
You must be insane to be a software developer (Matt Lacey) - Interesting question.
Visual Studio Code comes to Raspberry Pi - Raspberry Pi (Jim Bennett) - Great post from Jim. You can feel his excitement about this news... It's really motivating.
Master These Three Skills to Improve Time Management 🚀 (Anastasia) - Wow. This post contains tons of information, references, and good advice. To read.
Creating a Running Game in VueJS - Part 3 - Component Files and Installing Vue - The Long Walk (Paul Michaels) - Interesting step-by-step tutorial, part 3 of a series.
How to Install Windows Software Remotely Using PowerShell and Chocolatey - Petri (Russell Smith) - A very cool and simple way to solve a common difficulty now that so many people are not working in the office.
Open-source HTTP API packages and tools (Brady Gaster) - Excellent tutorial, second post of a series. It's definitely a great place to get started.
Save the Date for .NET Conf: Focus on Windows, February 25, 2021 (Katie Akrop DeSantis) - Looking forward to this event. Thursday the 25th.
Creating Discoverable HTTP APIs with ASP.NET Core 5 Web API | ASP.NET Blog (Brady Gaster) - An extremely complete tutorial to learn more about APIs.
Fun times with batch files : Algorithms for the masses - julian m bucknall (Julian M Bucknall) - Nice journey in the batch world.
Scheduling Integration with the Azure Health Bot (Gregory Lisiak) - Nice way to use today's technology to help educate people.
The simplest way to get started is to work with one file, then convert the query into a function. The way to do that is to replace the path of the CSV file with a variable that is passed in as a parameter: (filePath) =>.

To keep the code as simple as possible, I kept the function and the looping query separate, but they could be merged into a single query.

// -- fcnCleanOneCSV_v2 ----------------------------------------
(filePath) =>
let
    fnRawFileContents = (fullpath as text) as table =>
        let
            Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)), Splitter.SplitByNothing())
        in
            Value,
    Source = fnRawFileContents(filePath),
    #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
    DailyPosition = Table.PositionOf(Source, #"Daily Usage Row"{0}),
    TopRemoved = Table.Skip(Source, DailyPosition + 1),
    Result = Table.PromoteHeaders(TopRemoved)
in
    Result
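Before wiring up the folder loop, the function can be validated on a single file by invoking it from a small test query. The file name below is hypothetical; point it to one of your own consumption CSV files:

// -- Test fcnCleanOneCSV_v2 on a single file (file name is an example) --
let
    Source = fcnCleanOneCSV_v2("C:\Azure_Consumption_demo\CSV_v2\sample.csv")
in
    Source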
The first part loads the content of the file as a one-column table. Then DailyPosition stores the position where the Daily Usage data starts. This value is used in Table.Skip(Source, DailyPosition + 1) to keep only the rows after it; since Daily Usage is the last record set in the file, this works perfectly.

//== Process Folder CSV_v2 for Daily Usage ==============================
let
    Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
    MergedColumns = Table.CombineColumns(Source, {"Folder Path", "Name"}, Combiner.CombineTextByDelimiter("", QuoteStyle.None), "Merged"),
    RemovedOtherColumns = Table.SelectColumns(MergedColumns, {"Merged"}),
    Results = Table.AddColumn(RemovedOtherColumns, "GetCsvs", each fcnCleanOneCSV_v2([Merged])),
    #"Removed Columns" = Table.RemoveColumns(Results, {"Merged"}),
    #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}, {"Usage Date,Meter Category,Meter Id,Meter Sub-category,Meter Name,Meter Region,Unit,Consumed Quantity,Resource Location,Consumed Service,Resource Group,Instance Id,Tags,Additional Info,Service Info 1,Service Info 2"}),
    #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers", "Column1", Splitter.SplitTextByDelimiter(","), {"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter", {{"Column1.1", type text}, {"Column1.2", type text}, {"Column1.3", type text}, {"Column1.4", type text}, {"Column1.5", type text}, {"Column1.6", type text}, {"Column1.7", type text}, {"Column1.8", type text}, {"Column1.9", type text}, {"Column1.10", type text}, {"Column1.11", type text}, {"Column1.12", type text}, {"Column1.13", type text}, {"Column1.14", type text}, {"Column1.15", type text}, {"Column1.16", type text}}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Changed Type"),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers", {{"Usage Date", type date}, {"Meter Region", type text}}),
    #"Replaced Value" = Table.ReplaceValue(#"Changed Type1", """", "", Replacer.ReplaceText, {"Meter Category", "Meter Id", "Meter Sub-category", "Meter Name", "Meter Region", "Unit", "Resource Location", "Consumed Service", "Instance Id", "Tags", "Additional Info", "Service Info 1", "Service Info 2"}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Replaced Value", {{"Consumed Quantity", type number}})
in
    #"Changed Type2"
In the first steps, we get all the files in the folder, then combine the Folder Path and Name columns to build the full path of each file. That path is passed to the function we defined previously, and with Table.SplitColumn we rebuild the result as a table with multiple columns.

The next function is very similar, but this time uses Table.Range, since the rows we are looking for sit between the Statement header and the Daily Usage section.

//== fcnGetStatement ==========================================
(filePath) =>
let
    fnRawFileContents = (fullpath as text) as table =>
        let
            Value = Table.FromList(Lines.FromBinary(File.Contents(fullpath)), Splitter.SplitByNothing())
        in
            Value,
    Source = fnRawFileContents(filePath),
    #"Daily Usage Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Daily Usage")),
    DailyPosition = Table.PositionOf(Source, #"Daily Usage Row"{0}),
    #"Statement Row" = Table.SelectRows(Source, each Text.Contains([Column1], "Statement")),
    StatementPosition = Table.PositionOf(Source, #"Statement Row"{0}),
    SelectedRows = Table.Range(Source, StatementPosition + 1, (DailyPosition - StatementPosition) - 2),
    Result = Table.PromoteHeaders(SelectedRows)
in
    Result
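As a quick reminder of how Table.Range behaves — it takes a zero-based offset and a row count — here is a minimal, self-contained sketch with made-up values:

// -- Table.Range sketch (illustrative values only) ------------
let
    Source = Table.FromList({"a", "b", "c", "d", "e"}, Splitter.SplitByNothing()),
    Middle = Table.Range(Source, 1, 3)  // keeps the rows "b", "c", "d"
in
    Middle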
And once again, we loop through every file and do some clean-up.

//== Query Statements ========================================
let
    Source = Folder.Files("C:\Azure_Consumption_demo\CSV_v2\"),
    MergedColumns = Table.CombineColumns(Source, {"Folder Path", "Name"}, Combiner.CombineTextByDelimiter("", QuoteStyle.None), "Merged"),
    RemovedOtherColumns = Table.SelectColumns(MergedColumns, {"Merged"}),
    Results = Table.AddColumn(RemovedOtherColumns, "GetCsvs", each fcnGetStatement([Merged])),
    #"Removed Columns" = Table.RemoveColumns(Results, {"Merged"}),
    #"Expanded GetCsvs" = Table.ExpandTableColumn(#"Removed Columns", "GetCsvs", {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}, {"Billing Period,Meter Category,Meter Sub-category,Meter Name,Meter Region,SKU,Unit,Consumed Quantity,Included Quantity,Within Commitment,Overage Quantity,Currency,Overage,Commitment Rate,Rate,Value"}),
    #"Demoted Headers" = Table.DemoteHeaders(#"Expanded GetCsvs"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Demoted Headers", "Column1", Splitter.SplitTextByDelimiter(",", QuoteStyle.Csv), {"Column1.1", "Column1.2", "Column1.3", "Column1.4", "Column1.5", "Column1.6", "Column1.7", "Column1.8", "Column1.9", "Column1.10", "Column1.11", "Column1.12", "Column1.13", "Column1.14", "Column1.15", "Column1.16"}),
    #"Promoted Headers" = Table.PromoteHeaders(#"Split Column by Delimiter"),
    #"Replaced Value" = Table.ReplaceValue(#"Promoted Headers", """", "", Replacer.ReplaceText, {"Meter Category", "Meter Sub-category", "Meter Name", "Meter Region", "SKU", "Unit"})
in
    #"Replaced Value"