sarti.dev feed (Glenn Sarti): Stuff about Windows, DevOps, Puppet and Neo4j
Presentation - DevOps Killed Silos (2021-09-29, https://sarti.dev/presentation/devops-killed-silos)
<p>The Perth DevOps Meetup was running an “Unpopular Opinions” talk night. Each presenter had three minutes, and no slides, to talk about their unpopular opinion: whatever that meant!</p>
<p>I chose to talk about how organisational silos aren’t the evil things that DevOps purports them to be. In particular, this references an article on how <a href="https://www.rubick.com/how-to-build-silos-and-decrease-collaboration/">Increasing collaboration can do harm</a> by Jade Rubick.</p>
<p>The talk is basically a classical tragedy (think Macbeth or Dr. Faust), but really, really, really condensed 😅 into three minutes; almost one minute per part.</p>
<p>Presentation - None. No slides!</p>
<p>Recording - Not recorded</p>
<h1 id="devops-murdered-silos---a-tragedy-in-3-partsminutes">DevOps Murdered Silos - A Tragedy in 3 Parts(minutes)</h1>
<h2 id="the-setup">The Setup</h2>
<p>This is the Tragedy of Steve. Steve is a CIO at Parts Unlimited and is in charge of many things, including a bunch of developers and sysadmins. The Developers in their developer silo write the code and give it to the admins to run. The Admins in their silo try to run the code. And when it fails (and it always fails) the finger pointing and blaming starts. You know the usual: THOSE people in the OTHER silo were always wrong and broken!</p>
<p>No matter what Steve did, he couldn’t get them to work together and just ship the software HE needed for Parts Unlimited to prosper…</p>
<p>And then one day, Steve read a book. A book about DevOps, Phoenixes and Projects! And lo! heavenly voices sang and angels descended from up on high, to spread the divine knowledge of The DevOps. He took the trinity of DevOps (Systems thinking, Feedback loops and Learning Mindset) and taught the Dev and Admin leads to work better.</p>
<h2 id="the-promise">The Promise</h2>
<p>And slowly, ever so slowly, things began to change.</p>
<p>At first, the developers and admins <em>tolerated</em> each other
Then they began to respect each other… and then oh my goodness …
then they began to seek each other out and collaborate!</p>
<p>The DevOps prophets were right! With Dev and Ops collaborating, he could get software out to the users faster than ever!!!</p>
<h2 id="the-fall">The Fall</h2>
<p>… And then one day</p>
<p>Steve saw a DEVELOPER talking to someone in SECURITY! He thought it was glorious! And another ADMIN talking to FINANCE. The DevOps was spreading, and surely even better things could happen by breaking down more silos…</p>
<p>but Steve was wrong …</p>
<p>That developer had to wait another week to finish the feature because he was busy collaborating with Security. And then he noticed more people in the CAB meetings. Everyone was talking and collaborating, but it was taking forever to come to a decision and <em>actually</em> get work done.</p>
<p>Steve asked one of the architects to explain how the software that ran Parts Unlimited worked. What he saw was a mess of spaghetti. Everything was dependent on something else, just like all his teams were now. All communicating with each other!</p>
<p>Steve asked the admins, “How long does it take to release the software?”. They answered, “a week, maybe”. Steve thought, how could this be? That was exactly how long it used to take! DevOps had improved NOTHING.</p>
<p>But worse was still to come! And one day, Steve was asked to leave Parts Unlimited and never come back.</p>
<h2 id="the-regret">The Regret</h2>
<p>And it was only then that Steve realised his folly. That treacherous DevOps had not just broken down the silos but Murdered them. And with nothing to stop it, collaboration swept in, to suck the velocity out of his teams.</p>
<p>If only Steve had kept some of those silos alive and well. If only, in all his DevOps reading, he had heeded the warning in Conway’s Law, and seen that when collaboration is left unchecked, it can manifest itself as horrific software. And Steve mourned the death of those innocent Silos that he helped kill.</p>
<hr />
<p>The moral of the story: Don’t go around destroying Silos just because they are Silos. Silos can also be useful barriers that temper interactions, not just block them.</p>
Getting Started with Gatling (2021-09-14, https://sarti.dev/blog/starting-with-gatling)
<div style="font-size: 75%;">
<b>TLDR</b><br />
1. Download Gatling<br />
2. Install Java<br />
3. Copy a small simulation file<br />
4. Edit the simulation to suit your needs<br />
</div>
<hr />
<p>Recently I was working on an Azure Function Application and there were some performance requirements:</p>
<blockquote>
<ol>
<li>
<p>The Function Application must be able to service 750 connections in a day within normal response times</p>
</li>
<li>
<p>The Function Application must be able to service 100 simultaneous connections within normal response times</p>
</li>
</ol>
</blockquote>
<p><a href="https://gatling.io" target="_blank" class="align-right"><img src="https://sarti.dev/blog-images/gatling.png" alt="Gatling.io The best way to load test your applications" /></a></p>
<p>I was very sure the Function Application could do both, but how could I prove it? 🤔 I’ve been wanting to use the <a href="https://gatling.io">Gatling</a> tool for a while, and now seemed like the perfect opportunity to try it out!</p>
<h2 id="what-is-gatling">What is Gatling?</h2>
<p>So what is <a href="https://gatling.io">Gatling</a> then? From their own website:</p>
<blockquote>
<p>Gatling is a powerful open-source load testing solution.</p>
<p>Gatling is designed for continuous load testing and integrates with your development pipeline. Gatling includes a web recorder and colorful reports.</p>
</blockquote>
<p>Gatling can be used to make API, or general HTTP, calls at large scale. You can configure exactly what to send and how often. Gatling will then give you a nicely formatted aggregate of the results, including some neat reports and graphs. I really liked that it was open-source but, more importantly, it was <em>designed</em> to be used in continuous integration scenarios, which meant I could share my test files with other developers in source control.</p>
<p>However, one thing that initially put me off was that the configuration files were written in <a href="https://www.scala-lang.org/">Scala</a>. I have never written Scala, nor did I want to learn, but the Gatling documentation and tutorials were really good. I mean <em>REALLY</em> good. I could understand from the examples what they were trying to achieve, and the documentation would help me find the bits I didn’t know. Here’s an example from their <a href="https://gatling.io/docs/gatling/tutorials/quickstart/#gatling-scenario-explained">tutorial</a>:</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">package</span> <span class="nn">computerdatabase</span> <span class="c1">// 1</span>
<span class="k">import</span> <span class="nn">scala.concurrent.duration._</span>
<span class="k">import</span> <span class="nn">io.gatling.core.Predef._</span> <span class="c1">// 2</span>
<span class="k">import</span> <span class="nn">io.gatling.http.Predef._</span>
<span class="k">class</span> <span class="nc">BasicSimulation</span> <span class="k">extends</span> <span class="nc">Simulation</span> <span class="o">{</span> <span class="c1">// 3</span>
<span class="k">val</span> <span class="nv">httpProtocol</span> <span class="k">=</span> <span class="n">http</span> <span class="c1">// 4</span>
<span class="o">.</span><span class="py">baseUrl</span><span class="o">(</span><span class="s">"http://computer-database.gatling.io"</span><span class="o">)</span> <span class="c1">// 5</span>
<span class="o">.</span><span class="py">acceptHeader</span><span class="o">(</span><span class="s">"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"</span><span class="o">)</span> <span class="c1">// 6</span>
<span class="o">.</span><span class="py">doNotTrackHeader</span><span class="o">(</span><span class="s">"1"</span><span class="o">)</span>
<span class="o">.</span><span class="py">acceptLanguageHeader</span><span class="o">(</span><span class="s">"en-US,en;q=0.5"</span><span class="o">)</span>
<span class="o">.</span><span class="py">acceptEncodingHeader</span><span class="o">(</span><span class="s">"gzip, deflate"</span><span class="o">)</span>
<span class="o">.</span><span class="py">userAgentHeader</span><span class="o">(</span><span class="s">"Mozilla/5.0 (Windows NT 5.1; rv:31.0) Gecko/20100101 Firefox/31.0"</span><span class="o">)</span>
<span class="k">val</span> <span class="nv">scn</span> <span class="k">=</span> <span class="nf">scenario</span><span class="o">(</span><span class="s">"BasicSimulation"</span><span class="o">)</span> <span class="c1">// 7</span>
<span class="o">.</span><span class="py">exec</span><span class="o">(</span>
<span class="nf">http</span><span class="o">(</span><span class="s">"request_1"</span><span class="o">)</span> <span class="c1">// 8</span>
<span class="o">.</span><span class="py">get</span><span class="o">(</span><span class="s">"/"</span><span class="o">)</span>
<span class="o">)</span> <span class="c1">// 9</span>
<span class="o">.</span><span class="py">pause</span><span class="o">(</span><span class="mi">5</span><span class="o">)</span> <span class="c1">// 10</span>
<span class="nf">setUp</span><span class="o">(</span> <span class="c1">// 11</span>
<span class="nv">scn</span><span class="o">.</span><span class="py">inject</span><span class="o">(</span><span class="nf">atOnceUsers</span><span class="o">(</span><span class="mi">1</span><span class="o">))</span> <span class="c1">// 12</span>
<span class="o">).</span><span class="py">protocols</span><span class="o">(</span><span class="n">httpProtocol</span><span class="o">)</span> <span class="c1">// 13</span>
<span class="o">}</span>
</code></pre></div></div>
<p>Even with only a quick glance, you can kind of make out what it is trying to do:</p>
<ul>
<li>Set the Base URL (4)</li>
<li>Add some Headers to requests (5 & 6)</li>
<li>Set the HTTP Request Method (8 & 9)</li>
<li>Setup how many requests to send (<code class="language-plaintext highlighter-rouge">atOnceUsers(1)</code>) (12)</li>
</ul>
<p>And for most people starting out with Gatling, that’s enough. So let’s get started with Gatling and create our own test.</p>
<h2 id="installing-gatling">Installing Gatling</h2>
<p>Gatling is offered in two flavours: the Enterprise edition, and the <a href="https://gatling.io/open-source/#downloadgatling">Open Source edition</a>, which is the one we will be using. It downloads as a zip file containing a basic Gatling installation. At the time of writing, version 3.6.1 is the latest. Extract the ZIP file somewhere, in my case <code class="language-plaintext highlighter-rouge">C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle</code>, and then run Gatling:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle> .\bin\gatling.bat
GATLING_HOME is set to "C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle"
JAVA = "java"
'java' is not recognized as an internal or external command,
operable program or batch file.
Press any key to continue . . .
</code></pre></div></div>
<p>Oh no! Scala runs in a Java Virtual Machine (JVM), so of <em>course</em> it needs Java. Gatling lists the following compatible Java versions:</p>
<blockquote>
<p>Gatling supports 64bits OpenJDK 8 and OpenJDK 11 with HotSpot. Other JVMs such as JDK 12+, client JVMs, 32bits systems or OpenJ9 are not supported.</p>
</blockquote>
<p>So download and install an appropriate version of Java (due to the licensing shenanigans with Java, I can’t really recommend one for you). Once it’s installed, we can run Gatling again:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle> bin\gatling
GATLING_HOME is set to "C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle"
JAVA = "java"
Choose a simulation number:
[0] computerdatabase.BasicSimulation
[1] computerdatabase.advanced.AdvancedSimulationStep01
[2] computerdatabase.advanced.AdvancedSimulationStep02
[3] computerdatabase.advanced.AdvancedSimulationStep03
[4] computerdatabase.advanced.AdvancedSimulationStep04
[5] computerdatabase.advanced.AdvancedSimulationStep05
</code></pre></div></div>
<p>Nice! The default Gatling installation comes with some <a href="https://gatling.io/docs/gatling/tutorials/advanced/">default scenarios</a> so you can start straight away. These are in the <code class="language-plaintext highlighter-rouge">user-files/simulations</code> directory. But this is a getting started guide, so let’s create our own simulation file.</p>
<h2 id="creating-our-own-simulation">Creating our own Simulation</h2>
<p>Let’s create a simulation to call my blog (https://sarti.dev). First we need to create the simulation file, so create a file called <code class="language-plaintext highlighter-rouge">sartidev_test1.scala</code>:</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">package</span> <span class="nn">gatlingBlog</span>
<span class="k">import</span> <span class="nn">scala.concurrent.duration._</span>
<span class="k">import</span> <span class="nn">io.gatling.core.Predef._</span>
<span class="k">import</span> <span class="nn">io.gatling.http.Predef._</span>
<span class="k">class</span> <span class="nc">SartiDevSimulation1</span> <span class="k">extends</span> <span class="nc">Simulation</span> <span class="o">{</span>
<span class="k">val</span> <span class="nv">httpProtocol</span> <span class="k">=</span> <span class="n">http</span>
<span class="o">.</span><span class="py">baseUrl</span><span class="o">(</span><span class="s">"https://sarti.dev"</span><span class="o">)</span>
<span class="k">val</span> <span class="nv">scn</span> <span class="k">=</span> <span class="nf">scenario</span><span class="o">(</span><span class="s">"SendSimpleQuery"</span><span class="o">)</span>
<span class="o">.</span><span class="py">exec</span><span class="o">(</span>
<span class="nf">http</span><span class="o">(</span><span class="s">"root_request"</span><span class="o">)</span>
<span class="o">.</span><span class="py">get</span><span class="o">(</span><span class="s">"/"</span><span class="o">)</span>
<span class="o">)</span>
<span class="nf">setUp</span><span class="o">(</span><span class="nv">scn</span><span class="o">.</span><span class="py">inject</span><span class="o">(</span>
<span class="nf">atOnceUsers</span><span class="o">(</span><span class="mi">1</span><span class="o">)</span>
<span class="o">).</span><span class="py">protocols</span><span class="o">(</span><span class="n">httpProtocol</span><span class="o">))</span>
<span class="o">}</span>
</code></pre></div></div>
<p>Let’s break this down:</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">package</span> <span class="nn">gatlingBlog</span>
</code></pre></div></div>
<p>The optional package statement is used to logically group simulations together. In this case, all of the simulations for this blog post will be in the <code class="language-plaintext highlighter-rouge">gatlingBlog</code> package.</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">import</span> <span class="nn">scala.concurrent.duration._</span>
<span class="k">import</span> <span class="nn">io.gatling.core.Predef._</span>
<span class="k">import</span> <span class="nn">io.gatling.http.Predef._</span>
</code></pre></div></div>
<p>Next we import the default Gatling Scala packages so we can use them.</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">class</span> <span class="nc">SartiDevSimulation1</span> <span class="k">extends</span> <span class="nc">Simulation</span> <span class="o">{</span>
<span class="o">...</span>
<span class="o">}</span>
</code></pre></div></div>
<p>Then we create our simulation, called <code class="language-plaintext highlighter-rouge">SartiDevSimulation1</code>. This is a Scala class that extends (or “inherits from”) the Gatling Simulation class; this is how some of the “magic” happens later. Now we start defining our simulation.</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="k">val</span> <span class="nv">httpProtocol</span> <span class="k">=</span> <span class="n">http</span>
<span class="o">.</span><span class="py">baseUrl</span><span class="o">(</span><span class="s">"https://sarti.dev"</span><span class="o">)</span>
</code></pre></div></div>
<p>We create a protocol object, in this case for the HTTP protocol, and assign it a URL which will be the base for any other queries. This is also where you would put request headers and we’ll show that later. For now, we only need the URL.</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="k">val</span> <span class="nv">scn</span> <span class="k">=</span> <span class="nf">scenario</span><span class="o">(</span><span class="s">"SendSimpleQuery"</span><span class="o">)</span>
<span class="o">.</span><span class="py">exec</span><span class="o">(</span>
<span class="nf">http</span><span class="o">(</span><span class="s">"root_request"</span><span class="o">)</span>
<span class="o">.</span><span class="py">get</span><span class="o">(</span><span class="s">"/"</span><span class="o">)</span>
<span class="o">)</span>
</code></pre></div></div>
<p>Next we need to define the test scenario. In this case, we are just doing a plain old HTTP GET request to the root URL. This is exactly what a browser would request when you open my blog. This scenario can be extended with multiple calls or POSTs or delays. For now though a simple GET request will do.</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="nf">setUp</span><span class="o">(</span><span class="nv">scn</span><span class="o">.</span><span class="py">inject</span><span class="o">(</span>
<span class="nf">atOnceUsers</span><span class="o">(</span><span class="mi">1</span><span class="o">)</span>
<span class="o">).</span><span class="py">protocols</span><span class="o">(</span><span class="n">httpProtocol</span><span class="o">))</span>
</code></pre></div></div>
<p>And lastly, we set up the scenario to send only one request (<code class="language-plaintext highlighter-rouge">atOnceUsers(1)</code>) over the protocol we configured. This is where we can set the number of sustained concurrent connections or delays. For now we just want to test that the simulation even works.</p>
<p>Now let’s run the simulation. We pass in the simulation directory for our new file on the command line. For this example, I created my Gatling files in <code class="language-plaintext highlighter-rouge">C:\Source\code-glennsarti.github.io\gatling</code>:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle> .\bin\gatling.bat --simulations-folder C:\Source\code-glennsarti.github.io\gatling\
GATLING_HOME is set to "C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle"
JAVA = "java"
gatlingBlog.SartiDevSimulation1 is the only simulation, executing it.
Select run description (optional)
Simulation gatlingBlog.SartiDevSimulation1 started...
================================================================================
2021-09-13 21:14:15 1s elapsed
---- Requests ------------------------------------------------------------------
> Global (OK=1 KO=0 )
> root_request (OK=1 KO=0 )
---- SendSimpleQuery -----------------------------------------------------------
[##########################################################################]100%
waiting: 0 / active: 0 / done: 1
================================================================================
Simulation gatlingBlog.SartiDevSimulation1 completed in 1 seconds
Parsing log file(s)...
Parsing log file(s) done
Generating reports...
================================================================================
---- Global Information --------------------------------------------------------
> request count 1 (OK=1 KO=0 )
> min response time 1086 (OK=1086 KO=- )
> max response time 1086 (OK=1086 KO=- )
> mean response time 1086 (OK=1086 KO=- )
> std deviation 0 (OK=0 KO=- )
> response time 50th percentile 1086 (OK=1086 KO=- )
> response time 75th percentile 1086 (OK=1086 KO=- )
> response time 95th percentile 1086 (OK=1086 KO=- )
> response time 99th percentile 1086 (OK=1086 KO=- )
> mean requests/sec 0.5 (OK=0.5 KO=- )
---- Response Time Distribution ------------------------------------------------
> t < 800 ms 0 ( 0%)
> 800 ms < t < 1200 ms 1 (100%)
> t > 1200 ms 0 ( 0%)
> failed 0 ( 0%)
================================================================================
Reports generated in 0s.
Please open the following file: C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle\results\sartidevsimulation1-20210913131412780\index.html
Press any key to continue . . .
</code></pre></div></div>
<p>There’s a lot of text there, but the important part is that the request was successful. How do we know? Because of the request count:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>---- Global Information --------------------------------------------------------
> request count 1 (OK=1 KO=0 )
</code></pre></div></div>
<p><code class="language-plaintext highlighter-rouge">OK=1</code> means there was one request that returned a successful HTTP response code (200 OK).</p>
<p>And the response time from my blog was 1086ms (🎉 Yay for Australian Internet and vast distances):</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>> mean response time 1086 (OK=1086 KO=- )
</code></pre></div></div>
<p>This is all nice, but it doesn’t prove much yet. What if we sent 100 simultaneous requests? Well, we change one line;</p>
<p>From</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>atOnceUsers(1)
</code></pre></div></div>
<p>To</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>atOnceUsers(100)
</code></pre></div></div>
<p>And we run the simulation again using the same command line and we get the following results:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>================================================================================
---- Global Information --------------------------------------------------------
> request count 100 (OK=100 KO=0 )
> min response time 631 (OK=631 KO=- )
> max response time 2954 (OK=2954 KO=- )
> mean response time 1405 (OK=1405 KO=- )
> std deviation 371 (OK=371 KO=- )
> response time 50th percentile 1468 (OK=1468 KO=- )
> response time 75th percentile 1643 (OK=1643 KO=- )
> response time 95th percentile 1913 (OK=1913 KO=- )
> response time 99th percentile 2298 (OK=2298 KO=- )
> mean requests/sec 25 (OK=25 KO=- )
---- Response Time Distribution ------------------------------------------------
> t < 800 ms 3 ( 3%)
> 800 ms < t < 1200 ms 25 ( 25%)
> t > 1200 ms 72 ( 72%)
> failed 0 ( 0%)
================================================================================
</code></pre></div></div>
<p>Firstly, we see that 100 successful requests were made (<code class="language-plaintext highlighter-rouge">OK=100</code>), with a really nice breakdown of how my blog would respond if I had a sudden influx of people. But what if I had 100 simultaneous connections for 10 minutes? Well, again we change that one line:</p>
<p>From</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>atOnceUsers(100)
</code></pre></div></div>
<p>To</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>constantConcurrentUsers(100).during(10.minutes)
</code></pre></div></div>
<p>OK, so this looks a little different. This injection will continuously keep 100 connections active to the blog. This means every time a connection finishes, a new connection starts to take its place, keeping it at the 100 limit. And it will keep doing this for ten (10) minutes.</p>
<p>And again, running Gatling with the same command line:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>================================================================================
---- Global Information --------------------------------------------------------
> request count 59968 (OK=59967 KO=1 )
> min response time 82 (OK=82 KO=11023 )
> max response time 13494 (OK=13494 KO=11023 )
> mean response time 1000 (OK=1000 KO=11023 )
> std deviation 513 (OK=511 KO=0 )
> response time 50th percentile 1048 (OK=1048 KO=11023 )
> response time 75th percentile 1235 (OK=1235 KO=11023 )
> response time 95th percentile 1682 (OK=1682 KO=11023 )
> response time 99th percentile 2681 (OK=2681 KO=11023 )
> mean requests/sec 99.615 (OK=99.613 KO=0.002 )
---- Response Time Distribution ------------------------------------------------
> t < 800 ms 22926 ( 38%)
> 800 ms < t < 1200 ms 19412 ( 32%)
> t > 1200 ms 17629 ( 29%)
> failed 1 ( 0%)
---- Errors --------------------------------------------------------------------
> i.n.h.s.SslHandshakeTimeoutException: handshake timed out afte 1 (100.0%)
r 10000ms
================================================================================
</code></pre></div></div>
<p>There was one failure (<code class="language-plaintext highlighter-rouge">KO=1</code>) and Gatling nicely tells us why (SSL Handshake Timeout).</p>
<p>Gatling also outputs some really nice graphs of the results. In the very first example you may have noticed the text at the bottom saying <code class="language-plaintext highlighter-rouge">Please open the following file: C:\Source\gatling-charts-highcharts-bundle-3.6.1-bundle\results\s....</code>. This is where Gatling outputs the HTML reports; these report files would typically be published in a CI/CD pipeline. So let’s look at the report for the most recent run:</p>
<p><em>General Overview</em></p>
<p><img src="https://sarti.dev/blog-images/gatling1.png" alt="Gatling overview report" class="align-center" /></p>
<p><em>Response Time Distribution</em></p>
<p>551ms response was the most common</p>
<p><img src="https://sarti.dev/blog-images/gatling2.png" alt="Gatling response distribution" class="align-center" /></p>
<p><em>Response Time over the test period</em></p>
<p>This graph is probably the most interesting, as it shows when I received the HTTP error (the red arrow). You can see at that point the response time increased (the multicoloured spikes) and the number of active users dropped (the orange line).</p>
<p><img src="https://sarti.dev/blog-images/gatling3.png" alt="Gatling response time over the test period" class="align-center" /></p>
<p>But wait, that graph says I had 200 Active Users, not 100. What gives?</p>
<p>Well, this is due to Active Users being sampled over a one-second window. Remember that the most common response time was 551ms? That means within one second a single user could start two connections; that is, in that one second there were two active users. Now scale that up to 100 concurrent connections: it’s likely this was happening across <strong>all</strong> 100 connections, so 200 Active Users would be the average, and that’s what we see in the graph. Really the name should be “Active Users during a one second window”, but that’s too long to fit in the graph title.</p>
<h2 id="different-injection-types">Different injection types</h2>
<p>So far we’ve only used <a href="https://gatling.io/docs/gatling/reference/current/general/simulation_setup/#open-model"><code class="language-plaintext highlighter-rouge">atOnceUsers</code></a> and <a href="https://gatling.io/docs/gatling/reference/current/general/simulation_setup/#closed-model"><code class="language-plaintext highlighter-rouge">constantConcurrentUsers</code></a>, but the <a href="https://gatling.io/docs/gatling/reference/current/general/simulation_setup">Gatling Documentation</a> lists many others that you can use, for example:</p>
<p><code class="language-plaintext highlighter-rouge">nothingFor(duration)</code></p>
<p>Pauses the simulation for a period of time</p>
<p><code class="language-plaintext highlighter-rouge">rampUsersPerSec(rate1) to (rate2) during(duration)</code></p>
<p>Injects users from starting rate to target rate, defined in users per second, during a given duration. Users will be injected at regular intervals.</p>
<p>Remember that the injections happen sequentially and in order, so you can set up a fairly complex sequence of events for testing; for example, ramp up from 0 to 50 users over 20 minutes, then sustain 50 users for 30 minutes, and finally ramp down from 50 to 0 users over 2 minutes.</p>
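<p>As a rough sketch, a sequence like that could be expressed as a series of closed-model injection steps. The exact durations and user counts below are illustrative only, using the <code class="language-plaintext highlighter-rouge">rampConcurrentUsers</code> and <code class="language-plaintext highlighter-rouge">constantConcurrentUsers</code> steps from the Gatling documentation:</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code>setUp(scn.inject(
  rampConcurrentUsers(0).to(50).during(20.minutes),  // ramp up from 0 to 50 users
  constantConcurrentUsers(50).during(30.minutes),    // sustain 50 users
  rampConcurrentUsers(50).to(0).during(2.minutes)    // ramp back down to 0
).protocols(httpProtocol))
</code></pre></div></div>
<p>Each step only starts once the previous one has finished, which is what makes building these staged load profiles so straightforward.</p>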
<h2 id="different-request-information">Different request information</h2>
<p>The example above only had a basic request (a GET with no headers), but most requests will have some specific information you need to send, particularly for REST API requests. Let’s say we wanted to call an Azure API Management (APIM) hosted URL. That needs:</p>
<ul>
<li>Header with the APIM Subscription Key</li>
<li>Content-Type header to specify what is in the payload</li>
<li>A JSON String Payload <code class="language-plaintext highlighter-rouge">{"userid": 25,"name": "Glenn Sarti"}</code></li>
</ul>
<p>The scenario would look like:</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">scn</span> <span class="k">=</span> <span class="nf">scenario</span><span class="o">(</span><span class="s">"UpdateUserRequest"</span><span class="o">)</span>
<span class="o">.</span><span class="py">exec</span><span class="o">(</span>
<span class="nf">http</span><span class="o">(</span><span class="s">"UpdateUserRequest"</span><span class="o">)</span>
<span class="o">.</span><span class="py">post</span><span class="o">(</span><span class="s">"/api/v1/updateUser"</span><span class="o">)</span>
<span class="o">.</span><span class="py">header</span><span class="o">(</span><span class="s">"Ocp-Apim-Subscription-Key"</span><span class="o">,</span> <span class="s">"aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa"</span><span class="o">)</span>
<span class="o">.</span><span class="py">header</span><span class="o">(</span><span class="s">"Content-Type"</span><span class="o">,</span> <span class="s">"application/json"</span><span class="o">)</span>
<span class="o">.</span><span class="py">body</span><span class="o">(</span><span class="nc">StringBody</span><span class="o">(</span><span class="s">"""{"userid": 25,"name": "Glenn Sarti"}"""</span><span class="o">))</span>
<span class="o">)</span>
</code></pre></div></div>
<ul>
<li>
<p>Uses the HTTP <a href="https://gatling.io/docs/gatling/reference/current/http/request/#method-and-url">POST</a> method</p>
</li>
<li>
<p>Sets up multiple <a href="https://gatling.io/docs/gatling/reference/current/http/request/#headers">Request Headers</a></p>
</li>
<li>
<p>Sets the <a href="https://gatling.io/docs/gatling/reference/current/http/request/#request-body">Request Body as a string</a></p>
</li>
</ul>
<p>You can also use a JSON file instead of a string for the body. For example;</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="o">.</span><span class="py">body</span><span class="o">(</span><span class="nc">RawFileBody</span><span class="o">(</span><span class="s">"C:/Source/UserRequest.json"</span><span class="o">))</span>
</code></pre></div></div>
<h2 id="default-scenario-settings">Default scenario settings</h2>
<p>You can also specify defaults for all scenarios by setting them at the protocol layer. For example;</p>
<div class="language-scala highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="k">val</span> <span class="nv">httpProtocol</span> <span class="k">=</span> <span class="n">http</span>
<span class="o">.</span><span class="py">baseUrl</span><span class="o">(</span><span class="s">"https://sarti.dev"</span><span class="o">)</span>
<span class="o">.</span><span class="py">acceptHeader</span><span class="o">(</span><span class="s">"text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"</span><span class="o">)</span>
<span class="o">.</span><span class="py">doNotTrackHeader</span><span class="o">(</span><span class="s">"1"</span><span class="o">)</span>
<span class="o">.</span><span class="py">acceptLanguageHeader</span><span class="o">(</span><span class="s">"en-US,en;q=0.5"</span><span class="o">)</span>
<span class="o">.</span><span class="py">acceptEncodingHeader</span><span class="o">(</span><span class="s">"gzip, deflate"</span><span class="o">)</span>
<span class="o">.</span><span class="py">userAgentHeader</span><span class="o">(</span><span class="s">"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:16.0) Gecko/20100101 Firefox/16.0"</span><span class="o">)</span>
</code></pre></div></div>
<h2 id="wrapping-up">Wrapping up</h2>
<p>Gatling can be a very complicated and powerful tool for doing load testing, but getting started doesn’t have to be difficult. Start small and simple, and then use the excellent <a href="https://gatling.io/docs/gatling/reference/current/general/concepts/">Gatling documentation</a> to help you craft the exact testing scenarios you want to try out.</p>
<p>The Gatling simulation file is available on my GitHub repo for my blog <a href="https://github.com/glennsarti/code-glennsarti.github.io/tree/master/gatling">https://github.com/glennsarti/code-glennsarti.github.io/tree/master/gatling</a>.</p>
<h1 id="dynamic-test-matrices">Dynamic Test Matrices</h1>
<p>2021-01-13 · <a href="https://sarti.dev/blog/dynamic-test-matrix">https://sarti.dev/blog/dynamic-test-matrix</a></p>
<h1 id="the-problem">The problem</h1>
<p>I was working on the <a href="https://github.com/puppetlabs/puppet-editor-services">Puppet Editor Services</a> project, moving the automated CI pipelines from Travis and AppVeyor over to GitHub Actions. The project uses different combinations of Operating Systems, Ruby versions and Puppet versions to test against. This was achieved using a <a href="https://docs.travis-ci.com/user/build-matrix/">Travis Build Matrix</a> and an <a href="https://www.appveyor.com/docs/build-configuration/#build-matrix">AppVeyor Build Matrix</a>. But when I tried to do the same with a <a href="https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix">GitHub Actions Matrix</a> it just didn’t work the same 😢.</p>
<p>In short - You can’t add matrix entries that share the same keys. Instead you have to explicitly list out all combinations.</p>
<p>In my case, I had the following test cases in the matrix</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">os</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">windows-latest'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">ubuntu-latest'</span><span class="pi">]</span>
<span class="na">ruby</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">2.4'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">2.5'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">2.7'</span><span class="pi">]</span>
<span class="na">command</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">...</span><span class="nv"> </span><span class="s">run</span><span class="nv"> </span><span class="s">all</span><span class="nv"> </span><span class="s">tests</span><span class="nv"> </span><span class="s">...'</span><span class="pi">]</span>
</code></pre></div></div>
<p>This would run the tests for all the combinations of Operating System and Ruby version. This worked great, however I also needed to add a test for a specific Puppet version (and therefore a specific Ruby version), as there was a regression in Puppet that needed to be tested. Also, that test only needed to run a subset of the tests, not the whole suite. So the matrix now looked like:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">os</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">windows-latest'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">ubuntu-latest'</span><span class="pi">]</span>
<span class="na">ruby</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">2.4'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">2.5'</span><span class="pi">,</span> <span class="s1">'</span><span class="s">2.7'</span><span class="pi">]</span>
<span class="na">command</span><span class="pi">:</span> <span class="pi">[</span><span class="s1">'</span><span class="s">...</span><span class="nv"> </span><span class="s">run</span><span class="nv"> </span><span class="s">all</span><span class="nv"> </span><span class="s">tests</span><span class="nv"> </span><span class="s">...'</span><span class="pi">]</span>
<span class="na">include</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">os</span><span class="pi">:</span> <span class="s1">'</span><span class="s">ubuntu-latest'</span>
<span class="na">ruby</span><span class="pi">:</span> <span class="s1">'</span><span class="s">2.5'</span>
<span class="na">puppet_version</span><span class="pi">:</span> <span class="s1">'</span><span class="s">5.1.0'</span>
<span class="na">command</span><span class="pi">:</span> <span class="s1">'</span><span class="s">... only run unit tests ...'</span>
</code></pre></div></div>
<p>However GA (GitHub Actions) would then not run the “All Tests” command for Ruby 2.5 on ubuntu-latest. This was because the matrix entry was not adding a new combination, but overwriting an existing one. A quick search and I discovered this is a known “thing” with the matrix feature in GA. I tried to re-architect the matrix and rethink how I would do the testing, but nothing I came up with really worked, or was just too horrible to maintain.</p>
<h2 id="partial-solution">Partial solution</h2>
<p>The solution offered by GA was to not use a matrix at all, but define all the combinations explicitly. This meant turning my example 3 line matrix into a 24 line hardcoded list. This didn’t take into account the other tests that I hadn’t mentioned yet! I tried YAML anchors to reduce the copying of information but GA doesn’t support YAML anchors.</p>
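To give a sense of the scale, the explicit form would have looked something like this (abridged sketch; the values are illustrative):

```yaml
jobs:
  test:
    strategy:
      matrix:
        include:
          - { os: 'windows-latest', ruby: '2.4', command: '... run all tests ...' }
          - { os: 'windows-latest', ruby: '2.5', command: '... run all tests ...' }
          - { os: 'windows-latest', ruby: '2.7', command: '... run all tests ...' }
          - { os: 'ubuntu-latest',  ruby: '2.4', command: '... run all tests ...' }
          # ... and so on for every combination, plus the special Puppet version cell
```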
<p>Sure, this <em>could</em> work, but hard-coding seemed like a bad way to go. So more searching, and I came across a post about <a href="https://github.community/t/how-to-share-matrix-between-jobs/128595">sharing matrix information between jobs</a>, in particular this YAML example:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">job1</span><span class="pi">:</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">ubuntu-latest</span>
<span class="na">outputs</span><span class="pi">:</span>
<span class="na">matrix</span><span class="pi">:</span> <span class="s">${{ steps.set-matrix.outputs.matrix }}</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">id</span><span class="pi">:</span> <span class="s">set-matrix</span>
<span class="na">run</span><span class="pi">:</span> <span class="pi">|</span>
<span class="s">echo "::set-output name=matrix::[{\"go\":\"1.13\",\"commit\":\"v1.0.0\"},{\"go\":\"1.14\",\"commit\":\"v1.2.0\"}]"</span>
<span class="na">builder</span><span class="pi">:</span>
<span class="na">needs</span><span class="pi">:</span> <span class="s">job1</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">ubuntu-latest</span>
<span class="na">strategy</span><span class="pi">:</span>
<span class="na">matrix</span><span class="pi">:</span>
<span class="na">cfg</span><span class="pi">:</span> <span class="s">${{fromJson(needs.job1.outputs.matrix)}}</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">run</span><span class="pi">:</span> <span class="pi">|</span>
<span class="s">echo bin-${{ matrix.cfg.go }}-${{ matrix.cfg.commit }}</span>
</code></pre></div></div>
<p>So what’s going on here? Well this line looks very interesting:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">matrix</span><span class="pi">:</span>
<span class="na">cfg</span><span class="pi">:</span> <span class="s">${{fromJson(needs.job1.outputs.matrix)}}</span>
</code></pre></div></div>
<p>Instead of the matrix configuration being defined in YAML, it was consuming the JSON output from another job?!?! So the execution of the job looked like:</p>
<ul>
<li><code class="language-plaintext highlighter-rouge">job1</code> starts
<ul>
<li>Runs the step <code class="language-plaintext highlighter-rouge">set-matrix</code></li>
<li>The <code class="language-plaintext highlighter-rouge">set-matrix</code> step outputs a step variable called <code class="language-plaintext highlighter-rouge">matrix</code>. This is a compact JSON string.
See the <a href="https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-commands-for-github-actions#setting-an-output-parameter">set-output</a> documentation for more information about how to set output parameters</li>
</ul>
</li>
<li>The <code class="language-plaintext highlighter-rouge">builder</code> job starts after <code class="language-plaintext highlighter-rouge">job1</code> completes
<ul>
<li>The matrix configuration is deserialised from the JSON string value <code class="language-plaintext highlighter-rouge">needs.job1.outputs.matrix</code></li>
<li>The steps are then run for each entry on the matrix</li>
</ul>
</li>
</ul>
<p class="notice--warning">The example uses the <code class="language-plaintext highlighter-rouge">matrix.cfg</code> setting, but GA now supports using <code class="language-plaintext highlighter-rouge">matrix.include</code> directly. You’ll see this later in this post</p>
<p>This means we can use an arbitrary JSON string to configure the matrix! But so what? A hardcoded JSON string that’s read by a job is no different from hardcoding the YAML in the first place!</p>
<p>… But</p>
<p>… What if the JSON string wasn’t hardcoded?</p>
<p>… What if the JSON string was created on the fly? By a PowerShell script? 🤔</p>
<h2 id="crafting-test-matrices">Crafting Test Matrices</h2>
<p>So let’s start with a fresh GA Workflow called <code class="language-plaintext highlighter-rouge">dynamic-test-matrix</code>.</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">name</span><span class="pi">:</span> <span class="s">dynamic-test-matrix</span>
<span class="na">on</span><span class="pi">:</span>
<span class="na">push</span><span class="pi">:</span>
<span class="na">branches</span><span class="pi">:</span>
<span class="pi">-</span> <span class="s">main</span>
<span class="na">pull_request</span><span class="pi">:</span>
<span class="na">branches</span><span class="pi">:</span>
<span class="pi">-</span> <span class="s">main</span>
<span class="na">jobs</span><span class="pi">:</span>
<span class="na">matrix</span><span class="pi">:</span>
<span class="na">name</span><span class="pi">:</span> <span class="s">Generate test matrix</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">ubuntu-latest</span>
<span class="na">outputs</span><span class="pi">:</span>
<span class="na">matrix-json</span><span class="pi">:</span> <span class="s">${{ steps.set-matrix.outputs.matrix }}</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span>
<span class="pi">-</span> <span class="na">id</span><span class="pi">:</span> <span class="s">set-matrix</span>
<span class="na">shell</span><span class="pi">:</span> <span class="s">pwsh</span>
<span class="c1"># Use a small PowerShell script to generate the test matrix</span>
<span class="na">run</span><span class="pi">:</span> <span class="s2">"</span><span class="s">&</span><span class="nv"> </span><span class="s">.github/workflows/create-test-matrix.ps1"</span>
<span class="na">run-matrix</span><span class="pi">:</span>
<span class="na">needs</span><span class="pi">:</span> <span class="pi">[</span><span class="nv">matrix</span><span class="pi">]</span>
<span class="na">strategy</span><span class="pi">:</span>
<span class="na">fail-fast</span><span class="pi">:</span> <span class="no">false</span>
<span class="na">matrix</span><span class="pi">:</span>
<span class="na">include</span><span class="pi">:</span> <span class="s">${{ fromJson(needs.matrix.outputs.matrix-json) }}</span>
<span class="na">name</span><span class="pi">:</span> <span class="s2">"</span><span class="s">${{</span><span class="nv"> </span><span class="s">matrix.job_name</span><span class="nv"> </span><span class="s">}}"</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">${{ matrix.os }}</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Run Command</span>
<span class="na">shell</span><span class="pi">:</span> <span class="s">pwsh</span>
<span class="na">run</span><span class="pi">:</span> <span class="pi">|</span>
<span class="s">Write-Host "Run '${{ matrix.command }}'"</span>
</code></pre></div></div>
<p>Just like the example before there are two jobs:</p>
<ul>
<li><code class="language-plaintext highlighter-rouge">matrix</code> : Runs a PowerShell script to create the test matrix</li>
<li><code class="language-plaintext highlighter-rouge">run-matrix</code> : Pretends to run the command for each item in the matrix</li>
</ul>
<h3 id="the-matrices-creation-job">The matrices creation job</h3>
<p>Let’s look at the <code class="language-plaintext highlighter-rouge">matrix</code> job:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">name</span><span class="pi">:</span> <span class="s">Generate test matrix</span>
</code></pre></div></div>
<p>A nice friendly name in the UI</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">runs-on</span><span class="pi">:</span> <span class="s">ubuntu-latest</span>
</code></pre></div></div>
<p>The job will run on an Ubuntu-based runner. Why Ubuntu instead of Windows? There is no reason the script has to be Windows-specific, so why not make it a cross-platform script?</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">outputs</span><span class="pi">:</span>
<span class="na">matrix-json</span><span class="pi">:</span> <span class="s">${{ steps.set-matrix.outputs.matrix }}</span>
</code></pre></div></div>
<p>This instructs GA that it will output a Job parameter called <code class="language-plaintext highlighter-rouge">matrix-json</code>, and that its value comes from the step output <code class="language-plaintext highlighter-rouge">matrix</code>, from the step called <code class="language-plaintext highlighter-rouge">set-matrix</code>.</p>
<p>Next comes the steps for this job</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span>
</code></pre></div></div>
<p>The steps need the PowerShell script to run, so we do need to check out the project source code first</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="pi">-</span> <span class="na">id</span><span class="pi">:</span> <span class="s">set-matrix</span>
<span class="na">shell</span><span class="pi">:</span> <span class="s">pwsh</span>
<span class="c1"># Use a small PowerShell script to generate the test matrix</span>
<span class="na">run</span><span class="pi">:</span> <span class="s2">"</span><span class="s">&</span><span class="nv"> </span><span class="s">.github/workflows/create-test-matrix.ps1"</span>
</code></pre></div></div>
<p>The <code class="language-plaintext highlighter-rouge">set-matrix</code> step then runs the <code class="language-plaintext highlighter-rouge">create-test-matrix.ps1</code> PowerShell script (we’ll get to the script soon). Note that the <code class="language-plaintext highlighter-rouge">id</code> is important here, as this is the name used in the job output above.</p>
<h3 id="using-the-matrices">Using the matrices</h3>
<p>Let’s look at the <code class="language-plaintext highlighter-rouge">run-matrix</code> job:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">needs</span><span class="pi">:</span> <span class="pi">[</span><span class="nv">matrix</span><span class="pi">]</span>
</code></pre></div></div>
<p>This job (<code class="language-plaintext highlighter-rouge">run-matrix</code>) needs the output of the <code class="language-plaintext highlighter-rouge">matrix</code> job, so this instructs GA to wait until the <code class="language-plaintext highlighter-rouge">matrix</code> job completes.</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">strategy</span><span class="pi">:</span>
<span class="na">fail-fast</span><span class="pi">:</span> <span class="no">false</span>
</code></pre></div></div>
<p>For the purposes of this blog post, I want all of the test cells to run even if other ones have failed. The <a href="https://docs.github.com/en/free-pro-team@latest/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstrategyfail-fast">GA Documentation</a> has more examples of the matrix configuration</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">matrix</span><span class="pi">:</span>
<span class="na">include</span><span class="pi">:</span> <span class="s">${{ fromJson(needs.matrix.outputs.matrix-json) }}</span>
</code></pre></div></div>
<p>This is where the magic happens: we take the output from the <code class="language-plaintext highlighter-rouge">matrix</code> job to configure the matrix for this job. This is similar to the “sharing matrix information between jobs” post, where it deserialises the JSON string from the <code class="language-plaintext highlighter-rouge">matrix</code> job output parameter called <code class="language-plaintext highlighter-rouge">matrix-json</code>.</p>
<p>Note that I’m using <code class="language-plaintext highlighter-rouge">include</code> instead of <code class="language-plaintext highlighter-rouge">cfg</code> like the post above. This makes it easier to reference matrix items later in the steps. For example, instead of <code class="language-plaintext highlighter-rouge">matrix.cfg.os</code> we can just use <code class="language-plaintext highlighter-rouge">matrix.os</code>.</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">name</span><span class="pi">:</span> <span class="s2">"</span><span class="s">${{</span><span class="nv"> </span><span class="s">matrix.job_name</span><span class="nv"> </span><span class="s">}}"</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">${{ matrix.os }}</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Run Command</span>
<span class="na">shell</span><span class="pi">:</span> <span class="s">pwsh</span>
<span class="na">run</span><span class="pi">:</span> <span class="pi">|</span>
<span class="s">Write-Host "Run '${{ matrix.command }}'"</span>
</code></pre></div></div>
<p>And then finally the steps for the job.</p>
<ul>
<li><code class="language-plaintext highlighter-rouge">matrix.job_name</code> is used to dynamically change the friendly name of the job</li>
<li><code class="language-plaintext highlighter-rouge">matrix.os</code> is used to change which job runner is used</li>
<li><code class="language-plaintext highlighter-rouge">matrix.command</code> is used to show what PowerShell command could be run</li>
</ul>
<p class="notice--warning">This example won’t actually run any PowerShell commands but you can make GA run a PSake command, or PSBuild command, or run other PowerShell scripts in your project.</p>
<h3 id="creating-matrices-in-powershell">Creating matrices in PowerShell</h3>
<p>Here’s an example script that will output a two cell matrix: One cell for Windows and one for Ubuntu.</p>
<p>This would be saved as <code class="language-plaintext highlighter-rouge">.github/workflows/create-test-matrix.ps1</code>. If you change the name of this script, make sure you change the GA workflow to use the new name too.</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nv">$Jobs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">@()</span><span class="w">
</span><span class="p">@(</span><span class="s1">'ubuntu-latest'</span><span class="p">,</span><span class="w"> </span><span class="s1">'windows-latest'</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Run </span><span class="bp">$_</span><span class="s2"> jobs"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"</span><span class="bp">$_</span><span class="s2"> command"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="n">Write-Host</span><span class="w"> </span><span class="s2">"::set-output name=matrix::</span><span class="si">$(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="w"> </span><span class="nt">-Compress</span><span class="si">)</span><span class="s2">"</span><span class="w">
</span></code></pre></div></div>
<p>So what’s going on here:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nv">$Jobs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">@()</span><span class="w">
</span></code></pre></div></div>
<p>We store the Job information in the <code class="language-plaintext highlighter-rouge">$Jobs</code> variable. Initially we have no jobs</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="p">@(</span><span class="s1">'ubuntu-latest'</span><span class="p">,</span><span class="w"> </span><span class="s1">'windows-latest'</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span></code></pre></div></div>
<p>Instead of hardcoding, we can use loops and enumeration to create matrix cells</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="w"> </span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Run </span><span class="bp">$_</span><span class="s2"> jobs"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"</span><span class="bp">$_</span><span class="s2"> command"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>To create a matrix cell we add a HashTable to the <code class="language-plaintext highlighter-rouge">$Jobs</code> array. Each key in the HashTable appears as a matrix variable in the GA Workflow. In this example we are setting three keys; <code class="language-plaintext highlighter-rouge">job_name</code>, <code class="language-plaintext highlighter-rouge">os</code> and <code class="language-plaintext highlighter-rouge">command</code>. These are then used in the Workflow as <code class="language-plaintext highlighter-rouge">matrix.job_name</code>, <code class="language-plaintext highlighter-rouge">matrix.os</code> and <code class="language-plaintext highlighter-rouge">matrix.command</code> respectively. And each matrix cell does <em>not</em> have to have the same keys. It’s completely up to you what each matrix cell contains.</p>
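For example, the one-off Puppet test cell from the original problem could be appended with an extra key (a sketch; the key names and command are illustrative):

```powershell
# A one-off cell with an extra `puppet_version` key;
# the other cells don't need to define it
$Jobs += @{
  job_name       = 'Run Puppet 5.1.0 unit tests'
  os             = 'ubuntu-latest'
  puppet_version = '5.1.0'
  command        = '... only run unit tests ...'
}
```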
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">Write-Host</span><span class="w"> </span><span class="s2">"::set-output name=matrix::</span><span class="si">$(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="w"> </span><span class="nt">-Compress</span><span class="si">)</span><span class="s2">"</span><span class="w">
</span></code></pre></div></div>
<p>And then lastly we use the <code class="language-plaintext highlighter-rouge">set-output</code> magic text and convert the Jobs into a JSON string. Note the use of <code class="language-plaintext highlighter-rouge">-Compress</code> here. GA doesn’t allow line breaks in the output so compress is used to create a JSON string on a single line.</p>
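As a quick illustration of the difference (behaviour as in PowerShell 7, where <code class="language-plaintext highlighter-rouge">ConvertTo-Json</code> pretty-prints by default):

```powershell
@{ userid = 25 } | ConvertTo-Json            # pretty-printed across multiple lines
@{ userid = 25 } | ConvertTo-Json -Compress  # {"userid":25} on a single line
```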
<p>When you run this script you get the following output:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>::set-output name=matrix::[{"job_name":"Run ubuntu-latest jobs","command":"ubuntu-latest command","os":"ubuntu-latest"},{"job_name":"Run windows-latest jobs","command":"windows-latest command","os":"windows-latest"}]
</code></pre></div></div>
<p>Which, let’s be honest, is hard to read. Let’s add a script parameter called <code class="language-plaintext highlighter-rouge">Raw</code> which will output the JSON in a readable way for humans</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kr">param</span><span class="p">(</span><span class="w">
</span><span class="p">[</span><span class="n">Switch</span><span class="p">]</span><span class="nv">$Raw</span><span class="w">
</span><span class="p">)</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">@()</span><span class="w">
</span><span class="p">@(</span><span class="s1">'ubuntu-latest'</span><span class="p">,</span><span class="w"> </span><span class="s1">'windows-latest'</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Run </span><span class="bp">$_</span><span class="s2"> jobs"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"</span><span class="bp">$_</span><span class="s2"> command"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$Raw</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="n">Write-Host</span><span class="w"> </span><span class="p">(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="p">)</span><span class="w">
</span><span class="p">}</span><span class="w"> </span><span class="kr">else</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="c"># Output the result for consumption by GitHub Actions</span><span class="w">
</span><span class="n">Write-Host</span><span class="w"> </span><span class="s2">"::set-output name=matrix::</span><span class="si">$(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="w"> </span><span class="nt">-Compress</span><span class="si">)</span><span class="s2">"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>So now, running <code class="language-plaintext highlighter-rouge">.github/workflows/create-test-matrix.ps1 -Raw</code> gives us:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>[
{
"job_name": "Run ubuntu-latest jobs",
"command": "ubuntu-latest command",
"os": "ubuntu-latest"
},
{
"job_name": "Run windows-latest jobs",
"command": "windows-latest command",
"os": "windows-latest"
}
]
</code></pre></div></div>
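<p>For completeness, here’s a sketch of how a workflow could consume that output. The job and step ids (<code class="language-plaintext highlighter-rouge">matrix</code>, <code class="language-plaintext highlighter-rouge">generate</code>, <code class="language-plaintext highlighter-rouge">test</code>) are illustrative and not taken from the original workflow; the key parts are exposing the step output via <code class="language-plaintext highlighter-rouge">outputs:</code> and feeding it into <code class="language-plaintext highlighter-rouge">strategy.matrix</code> with <code class="language-plaintext highlighter-rouge">fromJSON</code>:</p>

```yaml
jobs:
  matrix:
    name: Generate test matrix
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.generate.outputs.matrix }}
    steps:
      - uses: actions/checkout@v2
      - id: generate
        shell: pwsh
        run: "& .github/workflows/create-test-matrix.ps1"
  test:
    needs: matrix
    strategy:
      matrix:
        cfg: ${{ fromJSON(needs.matrix.outputs.matrix) }}
    name: ${{ matrix.cfg.job_name }}
    runs-on: ${{ matrix.cfg.os }}
    steps:
      - shell: pwsh
        run: "Write-Host \"Run '${{ matrix.cfg.command }}'\""
```

<p>Each hashtable in the PowerShell array becomes one <code class="language-plaintext highlighter-rouge">matrix.cfg</code> entry, so one job is spawned per element of the JSON array.</p>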
<h2 id="seeing-the-github-action-in-action">Seeing the GitHub Action in action</h2>
<p><img src="https://sarti.dev/blog-images/dynamic-test-matrix-01.gif" alt="GitHub Action Workflow : Animated gif of the workflow running" class="align-center" /></p>
<ul>
<li>
<p>Running the workflow we see first that the <code class="language-plaintext highlighter-rouge">Generate test matrix</code> job is running but there are not yet any subsequent jobs</p>
</li>
<li>
<p>Once the generation job is complete, two new jobs appear. These are the two jobs we specified in our PowerShell script: <code class="language-plaintext highlighter-rouge">Run ubuntu-latest jobs</code> and <code class="language-plaintext highlighter-rouge">Run windows-latest jobs</code></p>
</li>
<li>
<p>When we look at the output of these commands we can see that the Ubuntu job has <code class="language-plaintext highlighter-rouge">Write-Host "Run 'ubuntu-latest command'"</code> and the Windows job has <code class="language-plaintext highlighter-rouge">Write-Host "Run 'windows-latest command'"</code>, just like we specified in our PowerShell script</p>
</li>
</ul>
<h2 id="back-to-the-original-problem-">Back to the original problem …</h2>
<p>Back to Puppet Editor Services … Now I could create a <a href="https://github.com/puppetlabs/puppet-editor-services/blob/main/.github/workflows/create-test-matrix.ps1">PowerShell script</a> which generated a JSON string with all of the test cases I needed (12 in total). I could easily see what each matrix cell did, and I could easily add and remove test cases in the future.</p>
<p><img src="https://sarti.dev/blog-images/dynamic-test-matrix-02.png" alt="GitHub Action Workflow : Puppet Editor Services output from main" class="align-center" /></p>
<p>All of the code for this is in <a href="https://github.com/puppetlabs/puppet-editor-services/pull/288">Pull Request 288 of the Puppet Editor Services project</a>.</p>
<h1 id="going-further">Going further</h1>
<p>Now that we are using PowerShell to generate the matrix, it opens up more opportunities:</p>
<p>You could add some extra tests if the person who raised the Pull Request has the username ‘glennsarti’:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$</span><span class="nn">ENV</span><span class="p">:</span><span class="nv">GITHUB_ACTOR</span><span class="w"> </span><span class="o">-eq</span><span class="w"> </span><span class="s1">'glennsarti'</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="c"># ...</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p class="notice--warning">See the <a href="https://docs.github.com/en/free-pro-team@latest/actions/reference/environment-variables#default-environment-variables">documentation</a> for the full list of GA Environment Variables</p>
<p>Or…</p>
<h2 id="context-aware-testing">Context aware testing</h2>
<p>What if we could detect which files were being changed in a Pull Request and then adjust the testing accordingly? For example:</p>
<p>Let’s say we had a PowerShell module which included documentation. If a Pull Request was <strong>ONLY</strong> changing the documentation files, then there’d be no need to run the PowerShell module tests. And vice versa.</p>
<p>Fortunately git can help us here. We can use <code class="language-plaintext highlighter-rouge">git diff --name-only</code> to list all of the files that are affected.</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kr">param</span><span class="p">(</span><span class="w">
</span><span class="p">[</span><span class="n">Switch</span><span class="p">]</span><span class="nv">$Raw</span><span class="p">,</span><span class="w">
</span><span class="p">[</span><span class="n">String</span><span class="p">]</span><span class="nv">$FromRef</span><span class="w">
</span><span class="p">)</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">@()</span><span class="w">
</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$false</span><span class="w">
</span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$false</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="o">!</span><span class="p">[</span><span class="n">String</span><span class="p">]::</span><span class="n">IsNullOrWhiteSpace</span><span class="p">(</span><span class="nv">$FromRef</span><span class="p">))</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="p">(</span><span class="o">&</span><span class="w"> </span><span class="n">git</span><span class="w"> </span><span class="nx">diff</span><span class="w"> </span><span class="nt">--name-only</span><span class="w"> </span><span class="nv">$FromRef</span><span class="o">...</span><span class="nf">HEAD</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="bp">$_</span><span class="w"> </span><span class="o">-like</span><span class="w"> </span><span class="s1">'src/*'</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w"> </span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="bp">$_</span><span class="w"> </span><span class="o">-like</span><span class="w"> </span><span class="s1">'docs/*'</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w"> </span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="c"># Make sure we test something</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="o">!</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">-and</span><span class="w"> </span><span class="o">!</span><span class="nv">$TestDocs</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w">
</span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">@(</span><span class="s1">'ubuntu-latest'</span><span class="p">,</span><span class="w"> </span><span class="s1">'windows-latest'</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$TestModule</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Test PowerShell Module - </span><span class="bp">$_</span><span class="s2">"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"psake test-powershell"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$TestDocs</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Test Documentation - </span><span class="bp">$_</span><span class="s2">"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"psake test-documentation"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$Raw</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="n">Write-Host</span><span class="w"> </span><span class="p">(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="p">)</span><span class="w">
</span><span class="p">}</span><span class="w"> </span><span class="kr">else</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="c"># Output the result for consumption by GitHub Actions</span><span class="w">
</span><span class="n">Write-Host</span><span class="w"> </span><span class="s2">"::set-output name=matrix::</span><span class="si">$(</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ConvertTo-JSON</span><span class="w"> </span><span class="nt">-Compress</span><span class="si">)</span><span class="s2">"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>Let’s break down the changes piece by piece.</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="w"> </span><span class="p">[</span><span class="n">String</span><span class="p">]</span><span class="nv">$FromRef</span><span class="w">
</span><span class="p">)</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">@()</span><span class="w">
</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$false</span><span class="w">
</span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$false</span><span class="w">
</span></code></pre></div></div>
<p>We add a new parameter called <code class="language-plaintext highlighter-rouge">FromRef</code> which specifies where in the git history we compare from. Typically this is the branch the Pull Request is targeted against. We also add two flag variables <code class="language-plaintext highlighter-rouge">TestModule</code> and <code class="language-plaintext highlighter-rouge">TestDocs</code> which we’ll use to track whether we should test the Module and Documentation.</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="o">!</span><span class="p">[</span><span class="n">String</span><span class="p">]::</span><span class="n">IsNullOrWhiteSpace</span><span class="p">(</span><span class="nv">$FromRef</span><span class="p">))</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="p">(</span><span class="o">&</span><span class="w"> </span><span class="n">git</span><span class="w"> </span><span class="nx">diff</span><span class="w"> </span><span class="nt">--name-only</span><span class="w"> </span><span class="nv">$FromRef</span><span class="o">...</span><span class="nf">HEAD</span><span class="p">)</span><span class="w"> </span><span class="o">|</span><span class="w"> </span><span class="n">ForEach-Object</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="bp">$_</span><span class="w"> </span><span class="o">-like</span><span class="w"> </span><span class="s1">'src/*'</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w"> </span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="bp">$_</span><span class="w"> </span><span class="o">-like</span><span class="w"> </span><span class="s1">'docs/*'</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w"> </span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w"> </span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>If the <code class="language-plaintext highlighter-rouge">FromRef</code> is set then we run the <code class="language-plaintext highlighter-rouge">git diff</code> command.</p>
<ul>
<li><code class="language-plaintext highlighter-rouge">--name-only</code> means it only returns the filenames instead of the full diff for each file</li>
<li><code class="language-plaintext highlighter-rouge">$FromRef...HEAD</code> means to compare from the common ancestor (merge base) of <code class="language-plaintext highlighter-rouge">$FromRef</code> and the current commit (known as <a href="https://git-scm.com/book/en/v2/Git-Internals-Git-References">HEAD</a>); in other words, it lists only the changes made since the branch diverged from <code class="language-plaintext highlighter-rouge">$FromRef</code></li>
</ul>
<p>Then for each file that has been changed we test if it’s a PowerShell Module file (<code class="language-plaintext highlighter-rouge">src/*</code>) or a documentation file (<code class="language-plaintext highlighter-rouge">docs/*</code>) and set the appropriate flag (TestModule or TestDocs)</p>
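<p>To see what the triple-dot range produces, here’s a small self-contained sketch. It builds a throwaway repository, so the repository path, branch names, and file names are all made up for illustration:</p>

```powershell
# Throwaway repo to demonstrate `git diff --name-only <ref>...HEAD`.
# All names here (paths, branches, identity) are illustrative only.
$repo = Join-Path ([IO.Path]::GetTempPath()) "diff-demo-$(Get-Random)"
New-Item -ItemType Directory -Path $repo | Out-Null
Push-Location $repo
& git init --quiet
& git -c user.name=demo -c user.email=demo@example.com commit --allow-empty --quiet -m 'initial'
$base = (& git rev-parse --abbrev-ref HEAD)   # e.g. 'main' or 'master'
& git checkout --quiet -b my-feature
New-Item -ItemType Directory -Path 'docs' | Out-Null
Set-Content -Path 'docs/readme.md' -Value 'hello'
& git add .
& git -c user.name=demo -c user.email=demo@example.com commit --quiet -m 'docs change'
# Only files changed on my-feature since it diverged from $base are listed
$changed = & git diff --name-only "$base...HEAD"
$changed
Pop-Location
Remove-Item -Recurse -Force $repo
```

<p>The diff reports only <code class="language-plaintext highlighter-rouge">docs/readme.md</code>, even though the base branch also has commits of its own; that is exactly the behaviour we want for deciding which tests a Pull Request needs.</p>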
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="c"># Make sure we test something</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="o">!</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">-and</span><span class="w"> </span><span class="o">!</span><span class="nv">$TestDocs</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$TestModule</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w">
</span><span class="nv">$TestDocs</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$true</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>It is possible that a Pull Request changes neither the Module nor the Documentation (or that no <code class="language-plaintext highlighter-rouge">FromRef</code> was given), so we test everything just in case.</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="w"> </span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$TestModule</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Test PowerShell Module - </span><span class="bp">$_</span><span class="s2">"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"psake test-powershell"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="kr">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$TestDocs</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$Jobs</span><span class="w"> </span><span class="o">+=</span><span class="w"> </span><span class="p">@{</span><span class="w">
</span><span class="nx">job_name</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"Test Documentation - </span><span class="bp">$_</span><span class="s2">"</span><span class="w">
</span><span class="nx">os</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="bp">$_</span><span class="w">
</span><span class="nx">command</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="s2">"psake test-documentation"</span><span class="w">
</span><span class="p">}</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>
<p>And now we only add the PowerShell Module and Documentation test jobs if the appropriate flag is set.</p>
<p>The last change is to the GA Workflow.</p>
<p>Previously we called the PowerShell script using</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">run</span><span class="pi">:</span> <span class="s2">"</span><span class="s">&</span><span class="nv"> </span><span class="s">.github/workflows/create-test-matrix.ps1"</span>
</code></pre></div></div>
<p>and now we can instead pass through the <code class="language-plaintext highlighter-rouge">-FromRef</code> argument:</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="na">run</span><span class="pi">:</span> <span class="s2">"</span><span class="s">&</span><span class="nv"> </span><span class="s">.github/workflows/create-test-matrix.ps1</span><span class="nv"> </span><span class="s">-FromRef</span><span class="nv"> </span><span class="s">'${{</span><span class="nv"> </span><span class="s">github.base_ref</span><span class="nv"> </span><span class="s">}}'"</span>
</code></pre></div></div>
<p>The <code class="language-plaintext highlighter-rouge">github.base_ref</code> variable comes from the GitHub Actions context syntax:</p>
<blockquote>
<p>The base_ref or target branch of the pull request in a workflow run. This property is only available when the event that triggers a workflow run is a pull_request.</p>
</blockquote>
<p><a href="https://docs.github.com/en/free-pro-team@latest/actions/reference/context-and-expression-syntax-for-github-actions#github-context">Context Documentation</a></p>
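<p>One practical caveat (an assumption about the checkout configuration, not something shown in the original workflow): <code class="language-plaintext highlighter-rouge">actions/checkout</code> performs a shallow, single-commit fetch by default, so the base branch ref may not exist locally and <code class="language-plaintext highlighter-rouge">git diff $FromRef...HEAD</code> can come back empty. Fetching the full history is one likely fix:</p>

```yaml
      - uses: actions/checkout@v2
        with:
          # Fetch all history so $FromRef (e.g. the PR base branch) is available locally
          fetch-depth: 0
```

<p>With an empty diff, the fallback above would still run every test, so the failure mode is slower builds rather than missed tests.</p>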
<h1 id="wrapping-up">Wrapping Up</h1>
<p>I migrated from Travis CI and AppVeyor to GitHub Actions using a custom PowerShell matrix-creation script, which gives me more power to cater for different testing scenarios. We also saw what else you could achieve with this technique: changing the tests based on who made the change, or changing the testing requirements based on which files were changed.</p>Glenn SartiCreating GitHub Action test matrices on the flyPresentation - Beyond Pester 1012020-10-15T00:00:00+08:002020-10-15T00:00:00+08:00https://sarti.dev/presentation/powershell-global-virtual-pester<h3 id="powershell--devops-global-conference-2020---online-edition"><a href="https://powershell.org/24hour/">PowerShell & DevOps Global Conference 2020 - Online Edition</a></h3>
<p>We see a lot of talks on testing PowerShell with Pester, but are the tests we write good tests? What makes a test “good”? How do we measure how effective our tests are? This talk will help you answer these questions, including why testing is important and how to apply these principles to your project.</p>
<hr />
<p><a href="https://powershell.org/24hour/">Event details</a></p>
<p><a href="https://speakerdeck.com/glennsarti/ps-virtual-global-20-beyond-pester-101-applying-testing-principles-to-powershell">Presentation</a></p>
<p><a href="https://www.youtube.com/watch?v=faYw8D2vM18&list=PLfeA8kIs7CodaH2nEt0bWxV5Yjcq3bcmr&index=12">Recording</a></p>
<h2 id="resources">Resources</h2>
<p>Types of software testing</p>
<p><a href="https://www.guru99.com/types-of-software-testing.html">https://www.guru99.com/types-of-software-testing.html</a></p>
<p>Pester book</p>
<p><a href="https://leanpub.com/pesterbook">https://leanpub.com/pesterbook</a></p>
<p>Images</p>
<p><a href="https://unsplash.com">https://unsplash.com</a></p>
<p>Dunning Kruger Effect</p>
<p><a href="https://www.xonitek.com/lessons-from-mt-stupid/">https://www.xonitek.com/lessons-from-mt-stupid/</a></p>
<p>Test Pyramid – Martin Fowler</p>
<p><a href="https://martinfowler.com/bliki/TestPyramid.html">https://martinfowler.com/bliki/TestPyramid.html</a></p>
<p>Arrange, Act, Assert</p>
<p><a href="http://wiki.c2.com/?ArrangeActAssert">http://wiki.c2.com/?ArrangeActAssert</a></p>
<p>Working effectively with Legacy Code</p>
<p><a href="https://www.amazon.com/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052">https://www.amazon.com/Working-Effectively-Legacy-Michael-Feathers/dp/0131177052</a></p>
<p>Parallel Pester</p>
<p><a href="https://github.com/glennsarti/ParallelPester/tree/feature-parallel-mode">https://github.com/glennsarti/ParallelPester/tree/feature-parallel-mode</a></p>
<p>PoshBot</p>
<p><a href="https://github.com/poshbotio/PoshBot">https://github.com/poshbotio/PoshBot</a></p>
<p>The Many Forms of Scripting: which to use – Manning</p>
<p><a href="http://freecontent.manning.com/the-many-forms-of-scripting-which-to-use/">http://freecontent.manning.com/the-many-forms-of-scripting-which-to-use/</a></p>Glenn SartiBeyond Pester 101: Applying testing principles to PowerShellThe ‘D’ Word2020-08-31T00:00:00+08:002020-08-31T00:00:00+08:00https://sarti.dev/blog/the-d-word<p>So it all started a while ago with a tweet from <a href="https://twitter.com/gaelcolas">Gael</a></p>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Saying they aren't developers 😏</p>— gael (@gaelcolas) <a href="https://twitter.com/gaelcolas/status/1101875515852386304?ref_src=twsrc%5Etfw">March 2, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<p>And I responded with</p>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Oh look a hill I will die on! You are devs and this is a good thing! <a href="https://t.co/HonV7PlRLJ">https://t.co/HonV7PlRLJ</a></p>— Glenn Sarti (@GlennSarti) <a href="https://twitter.com/GlennSarti/status/1102550296587194368?ref_src=twsrc%5Etfw">March 4, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<p>And apparently I hit a nerve … This wasn’t the first time I’ve had this reaction. But both <a href="https://twitter.com/MichaelBender">Michael Bender</a> and <a href="https://twitter.com/SoniaCuff">Sonia Cuff</a> disagreed, which gave me pause. I regard Michael and Sonia as progressives in the Windows IT Pro space, so why would they disagree with what seemed like an obvious, future-looking opinion?</p>
<p>Actually, for those that don’t know me, here’s some quick background. People have seen my talks, and chatted with me in person and on Slack, and they think I’m a Developer, but I actually started my Software Development career at age 40. Prior to that, I’d been doing Windows Client Deployment Engineering (SOE) for 20 years (now I feel old!). I still identify as an IT Pro, who is also a Developer.</p>
<p>So why wouldn’t IT Pros want to identify as developers? There’s nothing wrong with them. They’re perfectly normal human beings. I mean sure, just like any group of people there are some that are fantastic, and there are some that are … downright horrible. But I’m sure Devs say the same thing about IT Pros.</p>
<h3 id="bad-apples">Bad apples?</h3>
<p>So perhaps that’s it? People in support roles, like IT Pros, tend to only see the bad things that happen. I know that when I was fixing service requests and responding to incidents from developers, I never once had a ticket saying “Thanks for doing a great job!”.</p>
<p>If all you see are complaints from Developers I can certainly understand why people wouldn’t want to associate themselves as one.</p>
<h3 id="what-is-a-developer-anyway">What is a developer anyway?</h3>
<p>But what even is a “Developer” anyway? Just like in the IT Pro space, there are many types of developers. Application Developers, Frontend Developers, Backend Developers. We sometimes lump in Designers and Testers into the “Developer” bucket too. Let alone the rise of the Infrastructure Developer, or SRE/DevOps Engineers (Don’t get me started on those labels!) which bridge both disciplines.</p>
<p>But people have different views of what a developer is. If you think a developer is someone that makes mobile apps, then sure, of course an IT Pro will say “I’m not a Developer”. So I can also understand that an IT Pro won’t identify as a developer, given <em>their</em> definition of what a developer is.</p>
<h3 id="someone-told-me-im-not-a-developer">Someone told me I’m not a developer</h3>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Gatekeeping. <a href="https://t.co/1x5lnOrwPy">pic.twitter.com/1x5lnOrwPy</a></p>— Tyler Leonhardt 🔌🐚 (@TylerLeonhardt) <a href="https://twitter.com/TylerLeonhardt/status/1290897392338796544?ref_src=twsrc%5Etfw">August 5, 2020</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<p>I often see comments like “PowerShell isn’t a <em>real</em> programming language, therefore you’re not a <em>real</em> developer” or “PowerShell is <em>just</em> a scripting language” or other variants. We see this kind of gatekeeping crop up, and it’s not restricted to IT Pros vs Developers. This behaviour is just … unkind and unnecessary.</p>
<p>And if IT Pros are told often enough that “they’re not <em>really</em> developers”, I can certainly understand why they wouldn’t identify as one!</p>
<h2 id="labels-and-titles-are-a-projection-but-not-definitive">Labels and titles are a projection, but not definitive</h2>
<p>It’s been over a year since that original tweet from Gael and something Michael Bender said later (I wish I could find the tweet!) struck home. To paraphrase:</p>
<blockquote>
<p>If I’m an IT Pro <strong>and</strong> a Developer, then I’m also an Architect, a Tech Writer and more</p>
</blockquote>
<p>Why was this so profound to me? Michael really was all those things and he was right; he would have to identify as an Architect etc. More importantly though, why did he <em>choose</em> not to identify as a developer? Because the labels we <strong>choose</strong> for ourselves are important to <strong>us</strong>.</p>
<p>Michael chose to identify as an IT Pro and not a Developer, and I should respect that decision. Who am I to judge what a Developer is or is not? But that does not mean Michael doesn’t or can’t do Developer-like or Architect-like tasks and functions. The label does not define what Michael can and can’t do.</p>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">How about we just get rid of the labels? It's not inclusive and it creates silos because of the stigma & years of baggage each one carries. We are working towards a world where the lines are blurred. Yet we still maintain the titles & mindsets that built the lines.</p>— Michael Bender @ Home (@MichaelBender) <a href="https://twitter.com/MichaelBender/status/1035524938722144256?ref_src=twsrc%5Etfw">August 31, 2018</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<p>The labels are just a projection. They’re visible, but not concrete.</p>
<h2 id="conclusions">Conclusions</h2>
<p>So do I regret my initial tweet? I guess the answer is yes. Perhaps I should’ve tweeted something like:</p>
<blockquote>
<p>IT Pros or users of PowerShell, don’t have to be Developers, but they can!</p>
</blockquote>
<p>And this is what gets me excited about the future of the “IT Pro”.</p>
<blockquote>
<p>But admins aren’t devs, whether those admins are coding or not. Administrative coding is very distinct from application development, although they obviously have big areas of intersection and overlap.</p>
<p>— Don Jones - https://donjones.com/2018/08/30/ama-a-plethora-of-career-advice/</p>
</blockquote>
<p>IT Pros can <em>learn</em> from the good Software Developers. We can take the practices that they’ve tried and tested, and apply them to help <em>us</em> do <em>our</em> jobs better.</p>
<p>Even Sonia is learning git, which many think of as a developer-only tool. Just today I was teaching network engineers to use git and PowerShell to manage and deploy Azure Network Security Groups.</p>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Ok I may be getting hooked on the satisfaction of committing a good day's work and sending the pull request link to my US project colleague.</p>— Sonia Cuff (@SoniaCuff) <a href="https://twitter.com/SoniaCuff/status/1294181137296703488?ref_src=twsrc%5Etfw">August 14, 2020</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<p>You can be an IT Pro, who can <em>also</em> do developer things!</p>Glenn SartiOps are Devs - or are they?Azure NSG Rule Shenanigans2020-08-17T00:00:00+08:002020-08-17T00:00:00+08:00https://sarti.dev/blog/azure-nsg-rules<h2 id="what-are-nsg-rules">What are NSG Rules?</h2>
<p>Azure provides firewall-type features using <a href="https://docs.microsoft.com/en-us/azure/virtual-network/security-overview">Network Security Groups</a> (NSGs). So you can specify rules like:</p>
<blockquote>
<p>Allow TCP Network traffic from Computer A to Port 443 on Computer B</p>
</blockquote>
<p>or</p>
<blockquote>
<p>Deny all traffic</p>
</blockquote>
<p>I was at a client one day and a colleague asked for my help on a strange problem …</p>
<h2 id="infrastructure-setup">Infrastructure Setup</h2>
<p>He had a client application that connected to an <a href="https://www.f5.com/products/big-ip-services/virtual-editions">F5 Service</a>, which then connected to a Windows server in a virtual pool. On the Windows Server, an NSG was applied to the Network Interface. The application connected over TCP Port 9000. So something like this:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code> +---+
+--------+ +--------+ | N | +----------------+
| Client | ---TCP 9000---> | F5 | ---TCP 9000---> | S |--| Windows Server |
+--------+ +--------+ | G | +----------------+
+---+
</code></pre></div></div>
<p>But here was the strange thing … When he applied a rule which allowed all traffic from the F5 to the Windows Server, the application worked correctly. But when he applied a rule which allowed traffic from the F5 to the Windows Server on port 9000, the application didn’t work. It didn’t make sense! The traffic was allowed. Even the network tracing logs showed the required traffic was on port 9000. So why did adding port 9000 break the application?</p>
<h2 id="what-went-wrong">What went wrong?</h2>
<p>Looking at the F5 Virtual Pool information, it showed that the pool was unavailable. But the server was clearly available. So why did the F5 think the server wasn’t functioning? The virtual pool was using ICMP pings to check if a server was functioning. So the network looked more like this now:</p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code> +---+
+--------+ +--------+ ---TCP 9000---> | N | +----------------+
| Client | ---TCP 9000---> | F5 | | S |--| Windows Server |
+--------+ +--------+ --- ICMP ---- | G | +----------------+
+---+
</code></pre></div></div>
<p>The NSG rules should’ve been allowing all of that traffic, but obviously they were not. We added an additional rule which allowed ICMP from the F5 to the Windows Server, and almost immediately the F5 could ping the server, the virtual pool became available and the application worked! Great news, but why did we need the extra rule?</p>
<h2 id="when-any-doesnt-mean-any">When ANY doesn’t mean ANY</h2>
<p>Looking back to the NSG rule we used, it made sense why it broke, but it was certainly not obvious.</p>
<p>The original rule <code class="language-plaintext highlighter-rouge">Allow ANY protocol FROM F5 to WINDOWS SERVER</code> would allow TCP, ICMP and UDP (the protocols the NSG understands) from the F5 to the Windows Server.</p>
<p>The new rule <code class="language-plaintext highlighter-rouge">Allow ANY protocol FROM F5 to WINDOWS SERVER on Port 9000</code> was … interesting. The first part <code class="language-plaintext highlighter-rouge">ANY protocol</code> would allow TCP, UDP and ICMP. The last part <code class="language-plaintext highlighter-rouge">Port 9000</code> would only allow traffic to UDP and TCP port 9000. But what about ICMP? Well, ICMP doesn’t have ports, so it can never match port 9000. So the rule was a little confusing, but it turns out ANY protocol doesn’t always <em>mean</em> any.</p>
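<p>To make that concrete, here’s a tiny Python sketch of the rule-matching behaviour as I understand it. This is my own illustrative model (the function and names are mine, not Azure code), showing why a port restriction silently excludes ICMP:</p>

```python
# Toy model of NSG rule matching -- illustrative only, NOT Azure's implementation.
def rule_matches(rule_protocol, rule_dest_ports, protocol, dest_port=None):
    """Return True if the traffic matches the rule.

    rule_protocol:   'Any', 'Tcp', 'Udp' or 'Icmp'
    rule_dest_ports: None means any port; otherwise a set of allowed ports
    dest_port:       None for ICMP, which has no concept of ports
    """
    if rule_protocol != "Any" and rule_protocol != protocol:
        return False
    if rule_dest_ports is None:
        return True  # no port restriction: TCP, UDP and ICMP all match
    # A port restriction can never match ICMP traffic, which has no port.
    return dest_port in rule_dest_ports

# Original rule: ANY protocol, any port -> everything matches
assert rule_matches("Any", None, "Icmp")
assert rule_matches("Any", None, "Tcp", 9000)

# New rule: ANY protocol, Port 9000 -> TCP/UDP 9000 match, ICMP does not
assert rule_matches("Any", {9000}, "Tcp", 9000)
assert rule_matches("Any", {9000}, "Udp", 9000)
assert not rule_matches("Any", {9000}, "Icmp")  # the health-check pings
```

<p>That last assertion is exactly the failure mode we hit: the F5’s health-check pings never matched the rule, so they were dropped.</p>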
<h3 id="never-forget-icmp">Never forget ICMP</h3>
<p>So when adding ports to an NSG rule, remember that it won’t match the ICMP protocol. For allow or deny rules. And if you need that for logging or health checks, remember to add it in!</p>Glenn SartiStrange things afoot with Azure Network Security Group RulesNew feature for VSCode Remote Containers2020-07-21T00:00:00+08:002020-07-21T00:00:00+08:00https://sarti.dev/blog/vscode-containers<h2 id="i-was-about-to-">I was about to …</h2>
<p>So I was about to write a post about a book I had read recently. I fired up Docker Desktop and VSCode to start writing and then I saw a new option:</p>
<p><img src="https://sarti.dev/blog-images/vscode-clone-in-volume.png" alt="Clone in Volume screenshot" class="align-center" /></p>
<p>After much digging (Far more than I should’ve to be honest!) I found this small sentence in the <a href="https://code.visualstudio.com/updates/v1_47#_remote-development">VSCode 1.47 release notes</a></p>
<blockquote>
<p>Remote - Containers: Prompt to open repository in a volume.</p>
</blockquote>
<p>After more digging (Really? It shouldn’t be this hard to find this stuff…) I found the more <a href="https://github.com/microsoft/vscode-docs/blob/master/remote-release-notes/v1_47.md#containers-version-01280">detailed release notes</a></p>
<blockquote>
<p><em>Guidance to open a repository in a volume</em></p>
<p>When opening a Git repository folder, the Reopen in Container notification now offers to clone and reopen the repository in a Docker volume. Using a Docker volume has better disk performance because it uses a Linux filesystem without any extra layer between the Linux Kernel and the filesystem. (We do not show this guidance on Linux, but the feature is still available using the Remote-Containers: Open Repository in Container command.)</p>
</blockquote>
<h2 id="so-what">So what?</h2>
<p>Now some people may be thinking “So what?”, but this is a Big Deal™. I had recently stopped using WSL2 as the Docker backend as it had a <a href="https://github.com/microsoft/WSL/issues/4169">major drawback with file system watchers</a>. This broke two things: the first being VSCode’s own file watcher, which it uses to automatically refresh git status, and the second being Jekyll’s ability to automatically rebuild my blog while I wrote.</p>
<p>While I put in a workaround for Jekyll (forcing it to use polling instead of a file watcher), there was no way I could get around the VSCode Git problem.</p>
<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Hrmm, I'm pretty sure I'm hitting a variant of <a href="https://t.co/I5hK87jYqT">https://t.co/I5hK87jYqT</a><br /><br />File Watchers in Windows File space don't appear to be working...</p>— Glenn Sarti (@GlennSarti) <a href="https://twitter.com/GlennSarti/status/1276786181334683648?ref_src=twsrc%5Etfw">June 27, 2020</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<hr />
<p>So what does this new feature do? Well, previously the Remote Containers extension would mount my Windows project directory into a Linux container, which meant inside the container it was using the 9p filesystem which has the issues described above.</p>
<p>But now, it looks like VSCode will clone a copy of my project INTO the container, which means it’s on the “native” container filesystem, so my issues have all gone! Sure, I’m still subject to WSL2 performance issues, but it’s so much better than having a full Hyper-V VM. This also means it’s a <em>copy</em> of my local project, so even though it <em>looks</em> like my local project, you can’t just add and remove files like you could previously.</p>
<h2 id="thankyou-">Thankyou 🙏🙏</h2>
<p>So thanks VSCode Team! This is a great improvement for me!</p>Glenn SartiOpening VSCode Projects as a volume in VSCodeHello? Is it me you’re looking for?2020-06-26T00:00:00+08:002020-06-26T00:00:00+08:00https://sarti.dev/blog/hello-is-it-me<p>So, it’s been a while since my last real blog post, nearly a year and a half!
Where have I been? Well, quite a lot has happened since January 2019.
I’m no longer working at Puppet, and I’m now a Senior Consultant with <a href="https://purple.telstra.com.au">Telstra Purple</a>.
But first I want to talk about how I ended up at Purple.</p>
<h2 id="a-story-of-discovery-">A story of discovery …</h2>
<p>I had been going through a crisis-of-purpose for over a year.
I really enjoyed working with my team members at Puppet.
Even now I still keep in contact with them <em>but</em> something was missing.</p>
<p>I posed this question to a couple of mentors. Puppet is a good company, the people are nice, my career had never been as vibrant as the last three years and yet what was wrong?
And after thinking and searching, and quite frankly a butt-load of time, it came down to many things, but a few felt more important than others:</p>
<p><strong>My technical skills were increasing in areas that aren’t really in demand</strong></p>
<p>My skills in Ruby were getting better and better. But there was a big problem. Even a simple search on job-seeker sites showed that very few people are looking for ruby programmers, particularly ones with no Ruby on Rails experience. But then I looked at what I considered my technical strengths: <code class="language-plaintext highlighter-rouge">A Windows first infrastructure programmer, proficient in PowerShell and Ruby. In Open Source development environments.</code></p>
<p>There wasn’t much (to be honest, ANY) demand for this whole combination. Which led me to a second realisation:</p>
<p><strong>My technical skills were decreasing in areas that are in demand</strong></p>
<p>To be frank, my skills in cloud-based infrastructure were pretty much non-existent. I did try to fix that in my time at Puppet but, for whatever reasons which I won’t go into, I never managed to get roles which involved more of the Cloud. And the longer I stayed in my current role, the further behind I would fall in that space. And this was confirmed emphatically by a failed job interview:</p>
<p>Purple wasn’t the first company I had tried to get a job at. <a href="https://www.hashicorp.com/">HashiCorp</a> was hiring a Developer Advocate for the first time in the Asia Pacific (APAC) region. I was super excited about this opportunity as I had the chance to briefly meet the CEO (<a href="https://twitter.com/mitchellh">Mitchell Hashimoto</a>) when he was here in Perth. I got through the first, second, third and even fourth round of interviews, only to be told no. Now, to HashiCorp’s credit, they agreed to do a quick meeting with me about why I didn’t succeed (Thankyou Rachel Shaw!). They thought I was a great candidate and loved my teaching style but … my lack of cloud skills stopped them saying yes.</p>
<p>Clear evidence that my feelings were valid and should be a concern.</p>
<p><strong>Working remote, with bad timezones, was probably taking a toll on me</strong></p>
<p>I don’t consider myself an extrovert, but even introverts need social connection. I had been working remotely with really bad timezone overlaps for years now. And while I don’t think I was in a bad headspace, I was rarely in a great one. This was probably exacerbated by my crisis-of-purpose but still, it couldn’t be ignored. A more local timezone was required.</p>
<h2 id="taking-action">Taking action</h2>
<p>So with that, I went on the job search. I was being very particular about which companies to apply for, because Puppet was still a good place to be. I could still make a difference to the lives of thousands of Puppet users with the work I was doing. So I had the benefit of time.</p>
<p>I knew a few people who worked at Telstra Purple (née Readify), both from previous jobs and from various conferences. I respected these people and looked up to them (one of them is even a mentor of mine!), and it looked like a great opportunity for me to spread my infrastructure wings and find the purpose I was looking for!</p>
<p>So after two months of informal chats, interviews and so on, I signed the contract, accepting a job offer at Telstra Purple as a Senior Consultant 🎉🎉</p>
<h2 id="and-then-">And then …</h2>
<p>And between me signing the contract and actually starting the new job, COVID-19 hit hard. I was to be onboarding to a new company, and a new job, from the relative comfort of my home office.</p>
<p>Oh well. While I’m still lacking social connection, my timezones have improved VASTLY!</p>
<p>But a tip for new job seekers: DON’T ONBOARD DURING A PANDEMIC.</p>
<blockquote>
<p>⭐ 1 out of 5 stars</p>
<p>Would not do again!</p>
</blockquote>
<h2 id="what-about-the-puppet-vs-code-extension">What about the Puppet VS Code extension?</h2>
<p>I’m still actively involved in it, just not as much as I once was! I even have a 1.0 release coming up!</p>
<hr />
<p>So what about this blog then? I want to start blogging again. Something I had forgotten about blogging is that it’s a great way to cement my thoughts. And I have MANY of them going around in this head of mine. The other thing I forgot is blogs don’t have to be long. So look forward to more bitesize articles from me in the future!</p>Glenn SartiWhere have I been?Presentation - Sharing: What’s in it for me!?2019-10-08T00:00:00+08:002019-10-08T00:00:00+08:00https://sarti.dev/presentation/sharing-whats-in-it-for-me-dodsyd<h3 id="sharing-whats-in-it-for-me"><a href="https://www.devopsdayssydney.org/agenda/#session-100">Sharing: What’s in it for me!?</a></h3>
<hr />
<p>Sharing. It’s one of the four pillars of CAMS. We consume so much shared content but have we ever thought about sharing from the point of view of the Sharer? Why should I share? If I’m constantly sharing information what do I get out of it? What’s in it for me!?</p>
<p>This ignite talk looks at sharing through the eyes of a Sharer. What benefits do they receive out of sharing? What cultural or psychological benefits are there?</p>
<p><a href="https://speakerdeck.com/glennsarti/sharing-whats-in-it-for-me-devopsdays-syd-2019">Presentation</a></p>
<p>Recording - Not recorded</p>
<h3 id="transcript">Transcript</h3>
<hr />
<p>CAMS. The original four pillars of DevOps: Culture, Automation, Measurement and Sharing.
And yet, when I went back through all of the previous DevOpsDays talks, less than 1 percent were about Sharing!
So today I want to help even that score by appealing to your selfish side and answer the question: “Sharing. What’s in it for me?”</p>
<p>And, strangely enough, the most selfish way to share, is not to share with anyone. Instead share with a Rubber Duck!</p>
<p>Rubber ducking is a technique where, as you share a problem or idea with the Duck, the solution becomes clearer.
You start talking through a complex problem and mid sentence, <em>bing</em> a light bulb goes off, and the answer just appears!</p>
<p>There are many thoughts about why this works but here’s two:</p>
<ul>
<li>
<p>The first, Your mouth is slower than your brain. When you are speaking to your duck, it forces your brain to slow down and process information in depth.</p>
</li>
<li>
<p>The second, you need to take into account what the Duck knows.
It forces you to think from another point of view and then see other solutions.</p>
</li>
</ul>
<p>By taking the time to explain your problem to a duck, and providing the context that the duck needs to understand it, you are telling a story, and storytelling is a very powerful tool.
We are wired to listen and to tell stories, and there’s one story that is the most important to all of us. It’s our own story - This is autobiographical storytelling.</p>
<p>Dr. Sherry Hamby and her team have researched the power of emotional, autobiographical storytelling:</p>
<blockquote>
<p>… Research shows that even brief exercises can have substantial impacts on psychological and physical health even months after</p>
</blockquote>
<p>She found four benefits</p>
<ol>
<li>
<p>You will find your <strong>own</strong> voice when you write your story, because stories have structure. They have a beginning, middle and end.</p>
</li>
<li>
<p>And the <strong>act</strong> of writing your story will clarify what’s important to you. It re-affirms your values</p>
</li>
<li>
<p>When you share your story, you realise your words can be a positive power on other people</p>
</li>
<li>
<p>And all of this builds a sense of well-being, which then builds resilience for when adversity next strikes.</p>
</li>
</ol>
<p>Being resilient to adversity is important for our mental health.</p>
<p>Dr. Hamby says:</p>
<blockquote>
<p>Resilience is strengthened by recognizing that we are all experts in our <em>own</em> lives and we <em>all</em> have something to share with others.</p>
</blockquote>
<p>So how do you start writing stories then? It’s quite possible you’re already doing this.
If you have a blog, I’m sure you’re writing stories <em>you</em> want to tell, that <em>matter</em> to you, about problems <em>you</em> overcame. Or an incident post-mortem, or narrative based documentation, or answering questions on Stack Overflow. These are all forms of stories.
And this is also backed up by data …</p>
<p>Doctor Nicole Forsgren found that when leaders give their teams autonomy in their work, it leads to feelings of trust and voice, which then positively influences organizational culture.
This is the same Voice that is found during autobiographical storytelling.</p>
<p>Or perhaps use your Voice and become a mentor …
Kris Howard talked about the many benefits of sharing your story:
You can build your confidence, perspective, communication and leadership skills. Create a sense of adventure and just plain connect with other humans!</p>
<p>Now… as much as I’m painting this whole sharing thing as rainbows and unicorns, there’s a darker side here which we have to talk about.</p>
<p>We all have the ability to share, <strong>but it’s not safe</strong> for all of us to do so</p>
<p>Daily… we see occurrences where someone shares something and they are harassed, bullied and threatened.
People are sharing their stories, their thoughts and ideas; and never get to reap any of the benefits, because it’s not safe to do so.</p>
<p>Here in Australia, 91% of employees believe that mental health is important and yet <strong>ONLY 52%</strong> believe that their workplace is mentally healthy.
1 in 5 people take time off work due to mental unhealth.</p>
<p>And mental health and psychological safety aren’t just important for our own well-being. They’re a predictor of high performing teams.
Not your technical achievements, or org structure, or mission statement, but team dynamics and safety.</p>
<p>And so, if you have a voice in your community, if you enjoy the privilege of being able to share your stories without retribution, then you have an obligation to afford others the same - Give others the space, and most importantly, the safety to share their stories too.</p>
<p>So to answer: Sharing, what’s in it for me? Solving my own problems, thinking from other people’s perspectives, increasing my resilience and mental health!
Sounds great doesn’t it!!!</p>
<p>So as we move into Open Spaces and maybe start sharing our own stories, I’d like to leave you with a final thought from the poet James Russell Lowell,</p>
<blockquote>
<p>Not what we give, but what we share,</p>
<p>For the gift without the giver is bare;</p>
<p>Who gives them self with their alms feeds three,</p>
<p>Them self, their hungering neighbor, and IT.</p>
</blockquote>
<p>Thankyou</p>
<hr />
<h3 id="references-and-links">References and Links</h3>
<h4 id="patrick-debois">Patrick Debois</h4>
<ul>
<li>
<p><a href="https://jaxenter.com/s-is-for-sharing-tells-the-father-of-devops-patrick-debois-105077.html">https://jaxenter.com/s-is-for-sharing-tells-the-father-of-devops-patrick-debois-105077.html</a></p>
</li>
<li>
<p><a href="https://www.youtube.com/watch?v=j2EVXvqwKwY">https://www.youtube.com/watch?v=j2EVXvqwKwY</a></p>
</li>
</ul>
<h4 id="rubber-ducking">Rubber Ducking</h4>
<ul>
<li><a href="https://en.wikipedia.org/wiki/Rubber_duck_debugging">https://en.wikipedia.org/wiki/Rubber_duck_debugging</a></li>
</ul>
<h4 id="dr-sherry-hamby">Dr. Sherry Hamby</h4>
<ul>
<li>
<p><a href="https://www.sherryhamby.com/">https://www.sherryhamby.com/</a></p>
</li>
<li>
<p><a href="https://www.psychologytoday.com/blog/the-web-violence/201309/resilience-and-4-benefits-sharing-your-story">https://www.psychologytoday.com/blog/the-web-violence/201309/resilience-and-4-benefits-sharing-your-story</a></p>
</li>
<li>
<p><a href="https://www.lifepathsresearch.org/">https://www.lifepathsresearch.org/</a></p>
</li>
</ul>
<h4 id="dr-nicole-forsgren">Dr. Nicole Forsgren</h4>
<ul>
<li>
<p><a href="https://nicolefv.com/">https://nicolefv.com/</a></p>
</li>
<li>
<p><a href="https://devops-research.com/">https://devops-research.com/</a></p>
</li>
<li>
<p><a href="https://devops-research.com/2018/08/announcing-accelerate-state-of-devops-2018/">https://devops-research.com/2018/08/announcing-accelerate-state-of-devops-2018/</a></p>
</li>
</ul>
<h4 id="kris-howard">Kris Howard</h4>
<ul>
<li>
<p><a href="https://www.krishoward.org/">https://www.krishoward.org/</a></p>
</li>
<li>
<p><a href="https://www.youtube.com/watch?v=zMReShyytQA">https://www.youtube.com/watch?v=zMReShyytQA</a></p>
</li>
</ul>
<h4 id="state-of-workplace-mental-health-in-australia">State of Workplace Mental Health in Australia</h4>
<ul>
<li><a href="https://www.headsup.org.au/docs/default-source/resources/bl1270-report---tns-the-state-of-mental-health-in-australian-workplaces-hr.pdf?sfvrsn=94e47a4d_8">https://www.headsup.org.au/docs/default-source/resources/bl1270-report—tns-the-state-of-mental-health-in-australian-workplaces-hr.pdf?sfvrsn=94e47a4d_8</a></li>
</ul>Glenn Sarti(DevOpsDays Sydney 2019) This ignite talk looks at sharing through the eyes of a Sharer. What benefits do they receive out of sharing? What cultural or psychological benefits are there?Presentation - How to become a SHiPS wright2019-04-27T00:00:00+08:002019-04-27T00:00:00+08:00https://sarti.dev/presentation/powershell-summit2019-ships<h3 id="powershell-summit-north-america-2019"><a href="https://powershell.org/summit/">PowerShell Summit North America 2019</a></h3>
<p>A Shipwright is an artisan skilled in one or more of the tasks required to build vessels. A SHiPSwright is an artisan skilled in one or more of the tasks required to build PowerShell Providers. The SHiPS toolkit has been around for a while, but it can be a little difficult to get started.</p>
<hr />
<p><a href="https://app.socio.events/MjQ4Nw/agenda/14445/session/61497">Event details</a></p>
<p><a href="https://speakerdeck.com/glennsarti/how-to-become-a-ships-wright-building-with-ships">Presentation</a></p>
<p><a href="https://www.youtube.com/watch?v=iX62wii_r6g">Recording</a></p>
<h2 id="reviews">Reviews</h2>
<blockquote class="twitter-tweet" data-cards="hidden" data-lang="en"><p lang="en" dir="ltr"><a href="https://twitter.com/GlennSarti?ref_src=twsrc%5Etfw">@GlennSarti</a> I've been meaning to look into SHiPS! This really helped!!!<a href="https://t.co/nV9VhVHTdz">https://t.co/nV9VhVHTdz</a></p>— Irwin Strachan (@IrwinStrachan) <a href="https://twitter.com/IrwinStrachan/status/1130043257612840960?ref_src=twsrc%5Etfw">May 19, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<blockquote class="twitter-tweet" data-conversation="none" data-cards="hidden" data-partner="tweetdeck"><p lang="en" dir="ltr">The SHiPS wright talk is DAG-gum amazing. It inspired <a href="https://twitter.com/steviecoaster?ref_src=twsrc%5Etfw">@steviecoaster</a> to try it for Chocolatey. Did the recording catch you arguing with <a href="https://twitter.com/jsnover?ref_src=twsrc%5Etfw">@jsnover</a> in the Q&A? Were you "wright?" 🤣</p>— Steven Judd (@stevenjudd) <a href="https://twitter.com/stevenjudd/status/1130660495302766596?ref_src=twsrc%5Etfw">May 21, 2019</a></blockquote>
<script async="" src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>
<h2 id="resources">Resources</h2>
<p>Ravikanth Chaganti</p>
<blockquote>
<p>PS Conf EU 2018 - SHiPS: Walk-through a bare-metal system configuration</p>
</blockquote>
<p><a href="https://github.com/psconfeu/2018/tree/master/Ravikanth%20Chaganti/SHiPS">https://github.com/psconfeu/2018/tree/master/Ravikanth%20Chaganti/SHiPS</a></p>
<p>SHiPS GH Repo</p>
<p><a href="https://github.com/PowerShell/SHiPS">https://github.com/PowerShell/SHiPS</a></p>
<p>SHiPS PS Gallery</p>
<p><a href="https://www.powershellgallery.com/packages/SHiPS">https://www.powershellgallery.com/packages/SHiPS</a></p>
<p><a href="https://www.powershellgallery.com/packages?q=ships">https://www.powershellgallery.com/packages?q=ships</a></p>
<p>about_format.ps1xml</p>
<p><a href="https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_format.ps1xml?view=powershell-5.1">https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_format.ps1xml?view=powershell-5.1</a></p>
<p>Writing a PowerShell Formatting File</p>
<p><a href="https://docs.microsoft.com/en-us/powershell/developer/format/writing-a-powershell-formatting-file">https://docs.microsoft.com/en-us/powershell/developer/format/writing-a-powershell-formatting-file</a></p>
<p>SHiPS Default formatting</p>
<p><a href="https://github.com/PowerShell/SHiPS/blob/master/src/Modules/SHiPS.formats.ps1xml">https://github.com/PowerShell/SHiPS/blob/master/src/Modules/SHiPS.formats.ps1xml</a></p>
<p>Images</p>
<p><a href="https://unsplash.com">https://unsplash.com</a></p>
<h2 id="transcript">Transcript</h2>
<p><em>This is an approximate transcript of the presentation</em></p>
<div class="language-text highlighter-rouge"><div class="highlight"><pre class="highlight"><code>I tell this tale, which is strictly true,
Just by way of convincing you
How very little since things was made
that things have altered in the shipwright's trade.
In Blackwall Basin yesterday
A China barque re-fitting lay,
When an old shipwright with snow-white hair
Came up to watch us working there.
Now there wasn't a knot which the riggers knew
But the old man made it—and better too;
Nor there wasn't a sheet, or a lift, or a brace,
But the old man knew its lead and place.
</code></pre></div></div>
<p>Paraphrased from <a href="https://mainlynorfolk.info/peter.bellamy/songs/thebricklayerandtheshipwright.html">Rudyard Kipling</a></p>
<p>Shipwrights are artisans skilled in the tasks required to build vessels. They were highly sought after back in the days of wooden ships, a little less so now with modern ship fabrication techniques. But still, they were very, very highly skilled.</p>
<p>However there is a certain amount of irony. That song was from Rudyard Kipling back in 1910 and ends with;</p>
<blockquote>
<p>How very little, since things was made,
Anything alters in any one’s trade!</p>
</blockquote>
<p>Now we’re probably all in the IT trade and I’m pretty sure there’s a whole lot altering in our trade! So today I’m going to help you navigate through the rough waters of building with SHiPS,
the Simple Hierarchy in PowerShell provider!</p>
<p>So we’ll start with the obvious question: “What is SHiPS?” Obviously NOT boats.</p>
<p>SHiPS is a PowerShell module which makes it easier to develop PowerShell Providers. Which then begs the question: what is a PowerShell Provider? PowerShell Providers give users access to things that would normally be difficult to get at via the command line, for example, the Windows Certificate Store, and then present them in a consistent, known format: a filesystem. Providers have been around for a long time, since PowerShell 1, and you’re already using them; you just may not know it.</p>
<p>You can very quickly see what providers are available by running the Get-PSProvider command. And then use file system like commands; for example Get-ChildItem on the HKEY Local Machine System registry key. And you can use the other regular commands like Get and Set Location, New-Item, Remove-Item, Move-Item and so on.</p>
<p>Ok so Providers have been around a long time and we all use them. Why would I need SHiPS then? Providers are great. They hide so much of the complexity and difficulty of dealing with things like the Windows Certificate Store or SCCM. You just create a drive and start browsing. They’re so great that many people start going, “I want to write my own provider”. So surely there’s plenty of help, documentation and examples in the community to use?</p>
<p>Yeah, not so much …</p>
<p>Writing providers is hard. Here’s an example. The Microsoft Documentation has a quickstart which is nice but the first things you need to do are;</p>
<ul>
<li>Install Visual Studio</li>
<li>Install the PowerShell SDK</li>
<li>Create a class library – What’s that?</li>
<li>Create a project reference – Nope no idea</li>
<li>And start PowerShell with a command line that uses reflection; whatever that is!</li>
</ul>
<p>That is an immediate barrier for most PowerShell users. Most of us don’t even use Visual Studio.</p>
<p>Next, you need to have a solid understanding of C# and the tooling that goes with it. Again, most of us don’t have that.</p>
<p>And finally you need a good understanding of the inner workings of PowerShell and Providers using the PowerShell SDK. Oh also, which versions of PowerShell will you target? Are you going to support non-Windows platforms like Mac or Linux? To give you an idea of the size of the problem here, I went through the PowerShell codebase looking at the standard providers …</p>
<p>On the left is the Provider and on the right is the number of lines of code for that provider. So for example, the Registry provider has 4,373 lines of code. But look at the Variables provider. Now some people may be going, “239 lines, that doesn’t seem all that much”. But if you take into account all of the inherited classes needed just to create a basic provider, that number jumps up considerably. That’s a LOT of C# code, well over 3000 lines you need to read and understand in order to make what we would consider a “simple” provider. And this is where SHiPS comes in….</p>
<p>What SHiPS does is hide all of that really complex C# code away from you and simplifies developing providers. It also lets you create just the bits you need, in a language familiar to you: PowerShell. Ok so now we understand why we would use SHiPS and what problems it solves. But if providers have been around for a while, then when did SHiPS become a thing?</p>
<p>The history of SHiPS starts back in 2014, so this has been 5 years in the making! (Now some of this I had to reverse engineer, so hopefully it’s all true). Back in 2014 a community member called Jim Christopher (Beefarino) created a module called Simplex, which is “A powershell module used to create powershell providers using a simple DSL”. So in a way Simplex is the precursor to SHiPS.</p>
<p>Not long after he also released P2F, the PowerShell Provider Framework which he describes as “The PowerShell Provider Framework (P2F) performs the heavy lifting for developing PowerShell Providers.”. From what I gather he took Simplex and removed the DSL part, which resulted in a framework which anyone could use to create any provider in C#.</p>
<p>And then sometime in 2017 the PowerShell team was working on the Azure Cloud Shell, which needed a Provider. They ended up using P2F to create SHiPS, a framework so you could create providers in PowerShell script, instead of C#. In September, Cloud Shell went into public preview, and SHiPS went open source not long after. Since then there have been multiple releases, with the latest being 0.8.1.</p>
<p>This history is important because it sets the scene for how SHiPS works and how you write modules for it. What you see here is the architecture of SHiPS: You, the module author, write PowerShell Classes which SHiPS uses to interact with P2F, which interacts with PowerShell. All of that complexity below that dashed line is hidden away from you by SHiPS.</p>
<p>SHiPS makes it so easy, I can create a Provider which you can mount, and traverse like a file system in 16 lines of PowerShell (not 3000+). This is working code right here. I’ll be going through how to write a SHiPS module later on, but this is how small a provider can be.</p>
<p>So now I know more about SHiPS but what can <em>I</em> do with it? What things can I SHIPSify? There are quite a few SHiPS modules out in the wild and here are just a sample:</p>
<ul>
<li>
<p>Deepak has a DHCP Server drive</p>
</li>
<li>
<p>I have a Puppet drive which can browse a Puppet Master server. I also have a text adventure game called Pirate Booty which is a SHiPS drive</p>
</li>
<li>
<p>Ravi has a bunch including browsing the PS Gallery or Eventlog as a drive. He even has a SHiPS module to browse other SHiPS modules</p>
</li>
<li>
<p>Patrick has a drive to browse the Abstract Syntax Tree of PowerShell scripts</p>
</li>
<li>
<p>And of course the powershell team at the bottom there has the Azure Drive and CIM Drive</p>
</li>
</ul>
<p>But that doesn’t really help you. What can YOU SHiPSify? Working for Puppet provided me with the perfect answer: our company logo. This is a heart-shaped DAG, and you can “SHiPS-ify” anything that can be represented as a DAG.</p>
<p>(And if you’re from Australia or happen to be a sheep farmer, it’s not those kinds of dags!)</p>
<p>The kind of DAG I’m talking about is a Directed Acyclical Graph. Let’s break this down.</p>
<p>What is a graph? A graph in this context is made up of vertices, nodes, or points which are connected by edges, arcs, or lines. So the nodes are the yellow squares and the connections, or edges, between them are the white lines.</p>
<p>What does directed mean? It means that the connections between nodes have a direction. Note that a connection can’t be bidirectional, with an arrow at both ends; that would be two separate edges in two different directions. Direction also makes it easier to traverse the graph. For example, let’s say we want to get from node A to node C: the path we need to take is A, then B, then C.</p>
<p>What does acyclical mean? It means there are no cycles or circular references in the directed graph. In this example there are no cycles so it is Acyclical. But if we do this; This causes a cycle between A, B and C that goes on forever. But you <em>can</em> do this though. This creates two paths to get to C, but there is no cycle.</p>
<p>So that’s a brief introduction to graph theory and DAGs, but what does that have to do with real life? What does a DAG look like out in the wild?</p>
<p>Well the FileSystem and Registry … They are DAGs – Nodes connected by edges. AD Org units and certificate store - DAGs . Azure Resource Groups - DAGs. Azure DevOps Pipelines - DAGs. The AWS CloudFormation designer displays your templates as a DAG!</p>
<p>Org units, networks, language, CMDBs, Application menus … The list is HUGE. DAGs are everywhere once you start looking! But … So what? Well, you can take a DAG and define it in PowerShell classes, which is then used by SHiPS as a provider. SHiPS just calls nodes and edges by different names.</p>
<p>This is a DAG representation of the SHiPS example documentation, and what I’ve done is label each node and edge with the name you need to use in SHiPS. Nodes with children are called SHiPSDirectory and nodes without children are called SHiPSLeaf. The links between nodes come from the GetChildItem function. We’ll cover this in more detail in the demo.</p>
<p>Also as a side-note, I’m not sure why the designers mixed the metaphors here. Surely it should be Parent and Child, or Branch and Leaf; not Directory and Leaf.</p>
<hr />
<p>So let’s build a SHiPS module. We’ll build a module which we can use to see the agenda for PowerShell Summit. Before I jump into this, you should have some basic knowledge of how to write PowerShell modules and be able to read PowerShell classes. Again, I’ll have some links at the end of this talk if you want to read more. Each step that I go through is in my GitHub repo for this talk, so if you’re listening to the recording, you can follow along! Here’s a quick demo of what we are going to build …</p>
<p><em>DEMO</em></p>
<p>The first thing to do is close your computer and get out a pen and paper. We need to plan what the DAG will look like. In this case the Summit has speakers and sessions, and sessions happen at different times of the day. So I drew a quick diagram, and this is pretty much what you saw in the module demo. It may be a bit hard to see on the screen, so I cleaned it up a bit. We have the speakers on the right, listed under the Speakers directory. In the Agenda we can either list all sessions, or easily select only a particular day’s sessions. Now we have an idea of what we’re going to create, it’s time to start some PowerShell!</p>
<p>You’ll need the SHiPS module. You can either install it via the PowerShell Gallery or go to that link for instructions on how to build it yourself.</p>
<p>The first thing I created was the module manifest and the root object. This was just enough code so I could import the module and create a new PS Drive. If you remember the DAG, this is the root object up the top left. So here’s a cut-down version of the module manifest. The only important bit here is the <code class="language-plaintext highlighter-rouge">RequiredModules</code> section. Next we create the module script. Much like the example I showed you earlier, the amount of code you need just to create a SHiPS drive is very small. At the top we have the using statement. This is a “feature” of PowerShell classes and without it SHiPS won’t work.</p>
<p>Next we define the root object for the drive. You can pretty much call it any name you want but try and keep it small and simple. Note that it inherits from the SHiPSDirectory object. Remembering back to what I said earlier. Anything that has child objects is a Directory. Anything that doesn’t have child objects is a Leaf. Next is the constructor for the class. It takes a single parameter called name. And we then pass that to the base class to process. This is mandatory for all SHiPS objects. Even if the constructor does nothing like this one, you still need to define it. In later steps I use more complicated constructors.</p>
<p>And lastly the GetChildItem method. This gets called when a user wants to get the children of this object. Self-explanatory, I hope. For now this just returns an empty array, but we will add to it later. This is now enough code that we can import the module and mount the PS Drive. Some people may have noticed this line at the top. For the moment don’t worry about it; SHiPS has a caching ability which I will talk about later.</p>
<p>So let’s try using this. What we want is to import the module, create a new PS-Drive and then see that there are no child items. So we import the module. Next thing we do is create a new powershell drive. I’ve wrapped this over multiple lines so you can read it easier.</p>
<p>So we create a drive called Summit2019, with a provider called SHiPS. The Root parameter is a little trickier. It has the name of your module, then a hash, and then the name of the root object. So in our case the module name is PSSummitNA2019 and the name of the root object is Summit2019. You could call your class root or something else. And then we get the child items and there’s nothing, which is exactly what we expected.</p>
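<p>Putting those commands together, the mounting step looks something like this (the relative path to the manifest is an assumption; the actual layout is in the talk’s repo):</p>

```powershell
# Import the module, then mount it as a drive.
# The -Root format for SHiPS is always 'ModuleName#RootClassName'.
Import-Module .\PSSummitNA2019\PSSummitNA2019.psd1

New-PSDrive -Name 'Summit2019' `
            -PSProvider 'SHiPS' `
            -Root 'PSSummitNA2019#Summit2019'

# At this stage the root returns an empty array, so this shows nothing
Get-ChildItem -Path 'Summit2019:\'
```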
<p>So now we have a drive, time to create some directories. We’ll create the Speakers and Agenda directories first. So first we add a basic Directory called ‘Speakers’. Remember the constructor must always call <code class="language-plaintext highlighter-rouge">base</code> with a name. And then an empty child list in <code class="language-plaintext highlighter-rouge">GetChildItem</code>. And a similar directory called ‘Agenda’. Now we have the two child directories, Speakers and Agenda, we need to modify the root object to create them when GetChildItem is called. So what we do is create an instance of the Speakers object and pass in the name of the child. This new object is then added to the array <code class="language-plaintext highlighter-rouge">$obj</code>. Then we do the same for the Agenda object. So we should now have GetChildItem return an array with two items in it. Let’s try it.</p>
<p>So like before, we import the module and create a new PS Drive. And when we get the child items we get two items called Speakers and Agenda.</p>
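<p>At this step the module script might look something like the sketch below. This is a sketch in the spirit of the slides, not the exact code from the repo:</p>

```powershell
using module SHiPS

class Speakers : SHiPSDirectory {
    Speakers([string]$name) : base($name) { }
    [object[]] GetChildItem() { return @() }   # empty for now
}

class Agenda : SHiPSDirectory {
    Agenda([string]$name) : base($name) { }
    [object[]] GetChildItem() { return @() }   # empty for now
}

# The root object now creates the two child directories
class Summit2019 : SHiPSDirectory {
    Summit2019([string]$name) : base($name) { }

    [object[]] GetChildItem() {
        $obj = @()
        $obj += [Speakers]::new('Speakers')
        $obj += [Agenda]::new('Agenda')
        return $obj
    }
}
```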
<p>Ok this is nice, but not really useful. Now it’s time to actually display something useful. Let’s display the speaker information. We’ll be creating these parts of the DAG next. So we need the speaker information. Fortunately, the web app for the Summit accidentally publishes the entire speaker list as a JSON file. What I did is download this file and save it in a directory that the module could find. Once I had the data file, I created some private helper functions. <code class="language-plaintext highlighter-rouge">Get-SpeakerObject</code> and <code class="language-plaintext highlighter-rouge">Remove-HTML</code> to help read and parse the information.</p>
<ul>
<li>
<p><code class="language-plaintext highlighter-rouge">Get-SpeakerObject</code> reads the JSON file and converts it into PowerShell custom objects</p>
</li>
<li>
<p><code class="language-plaintext highlighter-rouge">Remove-HTML</code> is used to strip HTML tags from text as unfortunately the data from the app is a combination of raw text, markdown and HTML markup</p>
</li>
</ul>
<p>Now we can create the Speaker object. Note that this is SHiPSLeaf, not SHiPSDirectory, because it does not have child objects. Next we define the public properties for this object. A Speaker has a Name, Firstname, Lastname and a Bio. Now when we call Get-ChildItem all of these properties will appear, not just the Name, which is the default. The constructor for Speaker is a little different. We have the same <code class="language-plaintext highlighter-rouge">$name</code> like the directories before, but now we have an extra parameter called data. We can add additional parameters to the constructor; however, we must always pass “something” back to the base object, in this case the parameter called name. In the constructor we call the <code class="language-plaintext highlighter-rouge">PopulateFromData</code> method, which parses the data object and extracts the information we need to populate the public properties: Firstname, Lastname etc.</p>
<p>Note we use the <code class="language-plaintext highlighter-rouge">Remove-HTML</code> helper function for Bio. Now we could’ve stuck all the code in the <code class="language-plaintext highlighter-rouge">PopulateFromData</code> method directly in the constructor and that’s also fine. I just wanted to show you can create private functions within the class. We could’ve also put this in a normal PowerShell function as well. And you’ll see why this can be useful when I talk about testing later on.</p>
<p>Now we have a Speaker object, we can modify the Speakers directory. So instead of returning an empty list, for each item in the JSON list we create a new Speaker object and pass in the name and the JSON data. And this is why I have that additional data parameter in the constructor. I already have all of the speaker data right here, so why not just pass it into the Speaker object when I create it? Otherwise I’d need to parse the JSON file again EVERY time a speaker is created, which is completely unnecessary.</p>
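<p>A sketch of the Speaker leaf and the updated Speakers directory; the JSON property names and the <code class="language-plaintext highlighter-rouge">Remove-HTML</code> parameter name are assumptions, and the real code lives in the talk’s repo:</p>

```powershell
class Speaker : SHiPSLeaf {
    [string]$Firstname
    [string]$Lastname
    [string]$Bio

    # Extra 'data' parameter, but the base class still only receives the name
    Speaker([string]$name, [object]$data) : base($name) {
        $this.PopulateFromData($data)
    }

    hidden PopulateFromData([object]$data) {
        $this.Firstname = $data.firstname            # property names assumed
        $this.Lastname  = $data.lastname
        $this.Bio       = Remove-HTML -Text $data.bio
    }
}

class Speakers : SHiPSDirectory {
    Speakers([string]$name) : base($name) { }

    [object[]] GetChildItem() {
        $obj = @()
        foreach ($item in (Get-SpeakerObject)) {
            # Pass the parsed JSON in, so each Speaker doesn't re-read the file
            $obj += [Speaker]::new($item.name, $item)
        }
        return $obj
    }
}
```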
<p>So let’s see this in action … I’ll skip over importing the module and creating the PS Drive. Now when we get the child items we have the speakers, and if we expand all of the properties of a speaker you can see the public properties we created: Name, Firstname, Bio and so on.</p>
<p>So hopefully you’re seeing a pattern here how I create this module.</p>
<ul>
<li>
<p>Create a small thing</p>
</li>
<li>
<p>Test that it works</p>
</li>
<li>
<p>Expand on that small thing</p>
</li>
<li>
<p>Test that it works</p>
</li>
<li>
<p>And repeat.</p>
</li>
</ul>
<p>So let’s expand this further. Time to add the agenda information. We’ll be creating the objects under the Agenda. So how do we get the information? Just like the speaker JSON file, the PS Summit app also has an agenda JSON file. So again, I downloaded the file and created some helper functions to parse it.</p>
<ul>
<li>
<p><code class="language-plaintext highlighter-rouge">Get-SessionsObject</code> which, like <code class="language-plaintext highlighter-rouge">Get-SpeakerObject</code>, retrieves the agenda JSON file and converts it into PowerShell custom objects</p>
</li>
<li>
<p><code class="language-plaintext highlighter-rouge">Get-Sessions</code> allows me to return all sessions which match a filter, which I’ll show soon</p>
</li>
<li>
<p><code class="language-plaintext highlighter-rouge">ConvertFrom-EpochTime</code> – Yeah, for some reason known only to the app developers, the timestamps in the JSON data are Unix Epoch numbers, so converting them to Pacific Daylight Time was tricky!</p>
</li>
</ul>
<p>So again, create an object called AgendaSession as a SHiPSLeaf. We then create the public properties for a session: the Session ID, Name etc. Note the hidden Data property at the bottom. This is how we can mark private properties, so the SHiPS provider won’t display them to users. And then the constructor for the agenda session. Just like the Speaker object, we pass in a data object too.</p>
<p>Notice the use of id at the top, not name which we used previously. We have to use the Session ID for Agenda sessions for two reasons:</p>
<ol>
<li>
<p>Session Titles may not be unique, for example the session called Lunch will appear multiple times and object names MUST be unique within a directory.</p>
</li>
<li>
<p>Session Titles may contain illegal characters for a Leaf Name. The Id is a number so it’s safe to use</p>
</li>
</ol>
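<p>A sketch of the AgendaSession leaf, showing the hidden property and the Id-based name. The JSON property names and the <code class="language-plaintext highlighter-rouge">ConvertFrom-EpochTime</code> parameter name are assumptions:</p>

```powershell
class AgendaSession : SHiPSLeaf {
    [string]$Id
    [string]$Title
    [string]$StartTime
    [string]$Location
    hidden [object]$Data   # hidden: the provider won't display this to users

    # Named after the session Id, not the title: titles such as 'Lunch' repeat,
    # and titles may contain characters that are illegal in a leaf name.
    AgendaSession([object]$data) : base($data.id) {
        $this.Id        = $data.id
        $this.Title     = $data.title
        $this.StartTime = ConvertFrom-EpochTime -Epoch $data.start_time
        $this.Location  = $data.location
        $this.Data      = $data
    }
}
```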
<p>Ok, so now we have an AgendaSession, time to create an AgendaTrackSummary directory object. The track summary objects are the “All” and “Day 1” nodes on our DAG. Now the interesting thing is that in SHiPS we don’t have to have a single class per node. The same SHiPS object can be used for multiple nodes, and we use this trick for the AgendaTrackSummary object.</p>
<p>So we create an object called AgendaTrackSummary. Notice that Directories can have properties too, not just Leaf objects. It has a name, the number of sessions in the AgendaTrack and a private filter property. This property is what we’ll use to know which sessions this track summary will show. For example, for the track called “All”, the filter will be empty. For the track called “Day 1” the filter will be “Day = 1”, and so on. This is how we can use the same SHiPS object for multiple nodes in the DAG.</p>
<p>Then we have the constructor. Like normal it has name. It also has the filter that this track summary will use. And you can see how we use the <code class="language-plaintext highlighter-rouge">Get-Sessions</code> helper function down the bottom there. Makes it easier to read what’s going on. And finally, because this is a SHiPSDirectory we need the GetChildItem method, where we get all of the sessions based on the filter, and then create an AgendaSession for each item.</p>
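<p>The AgendaTrackSummary might be sketched like this: one class serving every summary node, driven by its filter. The <code class="language-plaintext highlighter-rouge">Get-Sessions</code> parameter shape is an assumption:</p>

```powershell
class AgendaTrackSummary : SHiPSDirectory {
    [int]$Sessions          # directories can have public properties too
    hidden [object]$Filter  # decides which sessions this summary shows

    AgendaTrackSummary([string]$name, [object]$filter) : base($name) {
        $this.Filter   = $filter
        $this.Sessions = @(Get-Sessions -Filter $filter).Count
    }

    [object[]] GetChildItem() {
        $obj = @()
        foreach ($session in (Get-Sessions -Filter $this.Filter)) {
            $obj += [AgendaSession]::new($session)
        }
        return $obj
    }
}
```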
<p>So now we have the AgendaTrackSummary object, time to modify the Agenda directory object to create all of the summaries. So here’s the old Agenda GetChildItem method. And the new code. The first part, which is highlighted, is somewhat straightforward. The agenda has 5 track summaries. The top one, called “All”, has an empty filter, so all sessions. The next one down, called “Day 1 – Mon”, has a filter of Day = 29, because Day 1 is the 29th of April. Then Day 2 etc.</p>
<p>This next part is a little more complicated. I found in the data, there are actual talk tracks. For example this SHiPS talk is in the “PowerShell Language” track. So. The first loop there goes through every single session and finds ALL of the unique track names. The second loop then takes all of the track names and then for each item, creates an AgendaTrackSummary object with a filter of “Track equals the track name”. And with that, all of the agenda information is created so let’s see it in action.</p>
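<p>The two loops described above can be sketched like this. The day values, filter shapes and the <code class="language-plaintext highlighter-rouge">track</code> property name are assumptions:</p>

```powershell
class Agenda : SHiPSDirectory {
    Agenda([string]$name) : base($name) { }

    [object[]] GetChildItem() {
        $obj = @()
        # Fixed summaries: an empty filter means 'all sessions'
        $obj += [AgendaTrackSummary]::new('All', @{})
        $obj += [AgendaTrackSummary]::new('Day 1 - Mon', @{ Day = 29 })
        $obj += [AgendaTrackSummary]::new('Day 2 - Tue', @{ Day = 30 })

        # First loop: find every unique talk-track name in the data
        $tracks = @()
        foreach ($session in (Get-Sessions)) {
            if ($tracks -notcontains $session.track) { $tracks += $session.track }
        }
        # Second loop: one summary per track, filtered by track name
        foreach ($track in $tracks) {
            $obj += [AgendaTrackSummary]::new($track, @{ Track = $track })
        }
        return $obj
    }
}
```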
<p>Let’s get the TrackSummary name and the number of sessions in each track as a table. And tada! There are 80 sessions in total, with my favourite track, Meal, having 9 sessions!</p>
<p>So the AgendaTrackSummary object is working; what about the actual session information? Let’s see what the first 5 sessions on Tuesday are … Huh, not really useful. Let’s try that again.</p>
<p>Let’s get all of the properties instead of the default ones. Much better. Breakfast is the first session on Tuesday! There’s too much to show on one slide, but you get the idea.</p>
<p>Phew….that was a lot to take in. Right now, our module is fully functional. You can find speakers and sessions with some nice filtering. But we can make it even better. We can modify our Leaf objects to give content. So that when users use the Get-Content cmdlet it will actually give them something useful.</p>
<p>And we do that by adding the GetContent method to our Leaf classes, which returns a string. So this is the Speaker object, and in it we return a markdown document of the speaker’s name and their bio. And this is the AgendaSession Leaf object, where we return some markdown text with the session’s name, time and location, and the long description of the session.</p>
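<p>Content support is just one more method on the leaf; a sketch for the Speaker class, with the other members elided:</p>

```powershell
class Speaker : SHiPSLeaf {
    [string]$Bio

    Speaker([string]$name) : base($name) { }

    # Called when a user runs Get-Content on this leaf; here we return markdown
    [string] GetContent() {
        return @"
# $($this.Name)

$($this.Bio)
"@
    }
}
```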
<p>So let’s see what this looks like. We can use the Get-Content cmdlet on speakers. So this is me! And what about a session on Tuesday? Note we have to use the Session ID, not its name. And there’s the session information. Of course, if you’re using PowerShell 6 you can use the Show-Markdown cmdlet to make this look pretty.</p>
<p>We can make the module even more useful. The sessions don’t actually say who’s speaking, which was an oversight. Also, in the web app version you can’t see the sessions for a speaker, which is really useful information. What we’re doing here is creating additional links between leaves, which is a little dangerous. Remember that we don’t want to create cycles in the DAG. So in our DAG for this module, if Speakers had links to their Sessions, and Sessions had links to their Speakers, we’d end up with this: a loop! So what can we do then?</p>
<p>Well, we can give “hints” but not direct links. The session objects can give the NAMES of the speakers, and speaker objects can give the Session IDs. That way a user can use cd or Set-Location using those hints. So to the Speaker object we add some public properties, e.g. how many sessions the speaker has, the name and time of the sessions and, most importantly, the SessionIDs at the top there.</p>
<p>And then in the Populate method we use the Get-Sessions filter to find all the sessions the speaker is speaking at and populate the public properties. Now the user has more information about a speaker, with the Session IDs if they need it, and we haven’t created any loops in our DAG. Next is to modify the AgendaSession object with speaker information.</p>
<p>Again, we add a new public property called speakers. And then we populate this by going through all of the speaker IDs for the session and then finding those IDs in the Speaker JSON file. Now the user has more information about a session, with the Speaker Names, and we haven’t created any loops in our DAG. So if we have at look at my speaker information now we can see all the Session information. And if we look at a session we can see the speaker names.</p>
<p>So the module is working fine, but it doesn’t look very nice. The default properties that are shown aren’t that useful. So let’s make it look pretty. PowerShell uses XML formatting files to tell PowerShell how to display objects. I’m not going to go into detail about this as there’s plenty of other blogs and documentation on it, but you can start with the PowerShell help system and about_Format.ps1xml. And there’s probably people here in this very conference that can help you too!</p>
<p>So we create the formatting XML file and then make the module use it by specifying the <code class="language-plaintext highlighter-rouge">FormatsToProcess</code> setting in the module manifest. And let’s see some before and after comparisons.</p>
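<p>In the manifest that is a one-line addition; this cut-down sketch uses file names that simply mirror the module name and are assumptions:</p>

```powershell
# PSSummitNA2019.psd1 (cut down)
@{
    RootModule       = 'PSSummitNA2019.psm1'
    ModuleVersion    = '1.0.0'
    RequiredModules  = @('SHiPS')
    # Tells PowerShell how to format our Speaker and AgendaSession objects
    FormatsToProcess = @('PSSummitNA2019.Format.ps1xml')
}
```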
<p>So let’s get the first 3 speakers as a table. This is what it used to look like … and now it looks a lot nicer and shows the most common information you need right away. No need to use Select-Object to get the properties you want.</p>
<p>What about the sessions … Yeah, not useful at all. But now … Much better!!!</p>
<hr />
<p>And that is how to create a SHiPS module for the PowerShell Summit Agenda. I went through a lot of things, so I’ll quickly recap.</p>
<ul>
<li>
<p>Start with some planning – Draw a picture of what the module will show the user. This is the DAG. The DAG may change WHILE you’re developing the module and that’s ok. But I find it’s really useful to have some idea of what the module will look like before I start writing PowerShell.</p>
</li>
<li>
<p>Create the root object first, Then create the directories, Then create the leaves. Don’t try and do them all at once</p>
</li>
<li>
<p>A process I found REALLY useful was to make small changes and test that they work. And then repeat that loop. It’s easy to be overwhelmed when first starting out and this loop really helps to stop being overwhelmed.</p>
</li>
<li>
<p>And lastly don’t worry about making it pretty at the beginning. First get it working and then make it pretty</p>
</li>
</ul>
<p>Alright what about some more advanced information about SHiPS</p>
<hr />
<p>So my example module used static data files, which is fine. But what about using remote services, like an API? Well, it’s all PowerShell, so whatever you can do in PowerShell you can use in SHiPS; if you can query it with Invoke-WebRequest or Invoke-RestMethod, you can use it. Here’s an example of another SHiPS provider I wrote. It calls the GitHub REST API and presents GitHub repos as a filesystem. So the GitHub view is on the left and the provider view on the right there. And this is the master branch of the puppetlabs-powershell module. Note how they look VERY similar. Which brings us to the next topic …</p>
<p>How do we manage credentials for a provider? You need credentials for the GitHub module I just showed you. The Azure SHiPS module does too. Currently the SHiPS provider doesn’t allow you to pass credentials from the New-PSDrive cmdlet; you’ll end up with this lovely error message and be sad. So what <em>can</em> you use then? You have a couple of options:</p>
<ul>
<li>
<p>Firstly add your comments and plus ones to github issue 110 in the SHiPS repo so the maintainers know this is something we want!</p>
</li>
<li>
<p>You can store tokens in environment variables, like I did for my github provider</p>
</li>
<li>
<p>But you could also store the authorization information in a file or in the registry.</p>
</li>
<li>
<p>Pretty much anything outside the “realm” of PowerShell. In particular global or module scope variables</p>
</li>
</ul>
<p>Which seems like an odd thing to say but this is due to an architecture decision in SHiPS</p>
<p>The variable scoping, or more specifically the runspaces used in SHiPS, will trip you up. It took me ages to debug this. When SHiPS creates a drive, it also creates its own runspace for that drive to operate in. That means exported functions, or global functions, will appear in the default runspace, which is where you import your module, but there will be an independent copy when the PS Drive gets created. I’ll run you through an example …</p>
<p>So we’re in PowerShell. Let’s import our GitHub SHiPS module. We import the module and it sets up a global variable called GithubToken so we can store our API token to talk to GitHub. Next we set our GitHub token to abc123 using the Set-GitHubToken cmdlet. Excellent, let’s create our new PSDrive and use our new token. We create our new PS Drive, and behind the scenes it creates a runspace and imports our GitHub module into that. Note now we have two variables called $Global:GitHubToken and two functions called Set-GitHubToken. And importantly, the drive runspace DOESN’T have our GitHub token in it.</p>
<p>Anything we do with that drive, e.g. cd, Get-ChildItem, Get-Content, happens in the drive runspace, and it will have an empty token. Even if we call Set-GitHubToken again, it gets called in the default runspace context. The drive runspace is effectively hidden from us. This is why you need to store credentials outside of a runspace, e.g. in environment variables.</p>
<p>Earlier I mentioned that SHiPS has a caching mechanism. The cache is turned off by default; when turned on, SHiPS will cache the directory tree as it gets used, which is great when users are traversing up and down the tree. Of course, if you want to, you can use your own caching mechanism too. You turn it on using the SHiPSProvider attribute on a class.</p>
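<p>Turning the cache on is a one-attribute change; <code class="language-plaintext highlighter-rouge">SHiPSProvider</code> and <code class="language-plaintext highlighter-rouge">UseCache</code> come from the SHiPS documentation, while the class itself is illustrative:</p>

```powershell
using module SHiPS

# UseCache makes SHiPS remember this directory's children after the first
# GetChildItem call, instead of rebuilding them on every traversal.
[SHiPSProvider(UseCache = $true)]
class Speakers : SHiPSDirectory {
    Speakers([string]$name) : base($name) { }
    [object[]] GetChildItem() { return @() }
}
```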
<p>SHiPS has other features, some of which you may have used with other providers. Earlier I showed you how to use Get-Content to get the speaker and agenda content. SHiPS also supports the Set-Content command, so you can write information. For example, if I wanted to update my own bio text, I could use Set-Content to do that.</p>
<p>We’re probably all used to using the pipeline to filter, but sometimes it’s preferable to do the filtering at the provider level. SHiPS supports filtering on the SHiPSDirectory object. So here I’m getting all the speakers and using the filter of “Glenn*”. The top one would be using the pipeline, and the bottom example is using SHiPS filtering.</p>
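<p>Side by side, the two approaches from that slide look something like this (drive name and paths taken from the earlier demo):</p>

```powershell
# Pipeline filtering: every child object is created, then filtered afterwards
Get-ChildItem -Path 'Summit2019:\Speakers' | Where-Object { $_.Name -like 'Glenn*' }

# Provider-level filtering: the filter is handed to the provider instead
Get-ChildItem -Path 'Summit2019:\Speakers' -Filter 'Glenn*'
```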
<p>Dynamic Parameters are additional parameters passed from GetChildItem. For example, I could define a dynamic parameter called Track which takes a value of General, so it would return all sessions that are in the General track. Or perhaps all the talks on Day 2, or from the speaker called Glenn. Or you could combine them together, for example all General track talks on Day 2. There’s a lot of content here, and the best place to get more information is the DynamicParameter sample in the SHiPS GitHub repository.</p>
<p>Ahh Testing… a topic dear to my heart. Yes you can test your SHiPS module using Pester, remember this is just PowerShell classes and functions and you can test it like any other PowerShell module.</p>
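<p>For example, because the private helpers are plain PowerShell functions, a Pester test might look like this sketch; the <code class="language-plaintext highlighter-rouge">ConvertFrom-EpochTime</code> parameter name and the relative module path are assumptions:</p>

```powershell
# Pester tests for one of the module's private helper functions
Describe 'ConvertFrom-EpochTime' {
    BeforeAll { Import-Module "$PSScriptRoot\..\PSSummitNA2019.psd1" -Force }

    It 'converts Unix epoch seconds to a DateTime' {
        # InModuleScope lets the test call functions not exported by the module
        InModuleScope PSSummitNA2019 {
            $result = ConvertFrom-EpochTime -Epoch 1556496000
            $result | Should -BeOfType [DateTime]
        }
    }
}
```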
<hr />
<p>So you’ve used SHiPS and you like it, but it has some significant limitations right now;</p>
<ul>
<li>
<p>Doesn’t support passing credentials.</p>
</li>
<li>
<p>It only has a tiny list of supported cmdlets. So you can’t use New-Item or Remove-Item etc.</p>
</li>
<li>
<p>It also doesn’t pass the drive information through into the provider context, so you have no idea where in the path a SHiPS object is, nor can you provide full paths for hints.</p>
</li>
</ul>
<p>BUT you can make changes to SHiPS, because it’s open source. However, the SHiPS project is still quite young as an open source project. Why does this matter? Well, it’s still difficult to contribute and to figure out how the project works. For example, there are no tags in the GitHub repo, the changelog is incomplete, and it uses a different git branching workflow than the other official PowerShell projects I’ve seen.</p>
<p>There are no unit tests for SHiPS or P2F, only integration tests, which makes it difficult to make changes with confidence. The test suite is still Azure Cloud Shell focused. The reference documentation is incomplete; however, the narrative documentation is excellent.</p>
<p>I have high hopes that Microsoft will make time to invest effort into this project, to make it easier for the community to contribute!</p>
<h3 id="wrapping-up">Wrapping up</h3>
<p>So wrapping up ….</p>
<ul>
<li>
<p>SHiPS is a User Experience module. It wants to make the life of the user easier; so always remember: what will the USER experience</p>
</li>
<li>
<p>If you feel overwhelmed when you start; make small changes and test it and loop.</p>
</li>
<li>
<p>Read the documentation. it is useful and has full examples of the advanced features</p>
</li>
<li>
<p>And always remember the DAG. Graphs are everywhere.</p>
</li>
</ul>