Saturday 30 September 2017

Project Honolulu "as a server"


So with MS Ignite behind us and with some great new revelations, I thought I'd take a look at the newly announced "Project Honolulu".  Project Honolulu (which can be downloaded here) is going to be the replacement for Microsoft's administrative MMC snap-in tools like Device Manager, Disk Manager etc.  It's not replacing the RSAT tools or the System Center suite, but complements them and fills the void in simple, basic server administration, particularly for Windows Server Core.  Before I begin, please remember that any thoughts/opinions in this blog are my own and are not influenced in any way, but feel free to comment - I love "nerd talk" - and any code posted comes "AS-IS": you, the executor, are responsible for your own stuff.

Project Honolulu can be installed on Windows 10 as an application. It then runs as a process and relies on that application running in the background. The alternative method is to install Project Honolulu on a server, where it installs as two components: a gateway (collecting data from server nodes) and a web server (presenting the web app).  Today I'm going to look at a server installation.
 
Environment
To get PH up and running in gateway mode you will need…. A server.

Host     Hyper-V
CPU      2 vCPU
Memory   2-4GB (Dynamic)
HDD      40GB VHDX
OS       Windows Server 2016

Simples!

Installation
My first instinct was to secure the web server with my own enterprise CA, so, using the remote certificate snap-in (ironically), I generated a certificate request file. There are other methods, e.g. certreq.exe, but this was the quickest for me.  I then logged on to my CA, completed the request and imported the issued certificate into the server's personal certificate store.
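As an aside, if your CA publishes an enrolable web server template, the whole dance can be done in one line of PowerShell. A minimal sketch - the template name and FQDN below are illustrative, so adjust for your environment:

# Request a certificate from the enterprise CA and drop it straight into
# the machine's personal store. Template name and FQDN are illustrative.
Get-Certificate -Template "WebServer" `
    -SubjectName "CN=honolulu.mylab.local" `
    -DnsName "honolulu.mylab.local" `
    -CertStoreLocation Cert:\LocalMachine\My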

Now I launched the MSI and agreed to the licensing agreement. I also agreed to allow my server the ability to update its TrustedHosts list - this is needed for managing workgroup servers.  After this, I was prompted to provide a port for the web application and to either allow the application to generate a self-signed certificate or specify one of my own.  As I'd generated my own certificate, I chose to use that.
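For reference, pulling the thumbprint of the certificate out of the local machine's personal store is one line of PowerShell:

# List the certificates in the machine's personal store with their thumbprints.
Get-ChildItem Cert:\LocalMachine\My | Select-Object Subject, NotAfter, Thumbprint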

I specified my preferred port and entered the thumbprint of my generated certificate. Then I hit my first hurdle… :(

Hmmm? 

I attempted the install with other ports, both registered and ephemeral, but with no joy. With very little detail available on the web server and its configuration, my troubleshooting options were pretty limited.  Just as an off-chance (and with time against me) I decided to install the application using the built-in self-signing function, which installed with no errors…. Odd - thoughts, @Microsoft?

After this, the installation went through with no further issues.  Upon completion I opened a web browser and browsed to https://<PH SERVER FQDN>:<CONFIGURED PORT>.  There's a short tour to skip through and that's it.



Add Managed Endpoints

At this point I am now ready to add endpoints to be managed.  By clicking on the plus sign on the right-hand side you can enter a server's details and the credentials to be used to manage it - awesome!  The PH server will then gather data from, and manage, the server using the credentials provided.

Alternatively, you can give the Honolulu server's AD object permission to manage the endpoints.  As the Honolulu service runs as the Network Service account, configuring the computer account with delegate permissions to manage the endpoints allows them to be managed automatically.  However, my immediate reaction to this was that there appears to be no sense of role-based access control, so if a Honolulu server has access to manage other servers on a network, sys admins would quickly lose sight of what admins can do…. For example, a low-level summer intern would have the same abilities as a second-line support engineer.

This, thankfully, is not the case.  The Network Service account simply discovers the endpoints.  The administration of those endpoints is executed via remote PowerShell and WinRM, but the admin executing the commands still has their own credentials passed through.  So, as long as our low-level summer intern doesn't have the rights to shut down that business-critical ERP server, they won't be able to do it using Honolulu either.

To give the Honolulu server the delegated permission to discover the endpoints, the Honolulu server's computer account needs to be added to each endpoint's PrincipalsAllowedToDelegateToAccount setting.  MS have kindly documented this here.  I, however, have taken it one step further.

$gateway = "HONOLULU SERVER"
$gatewayObject = Get-ADComputer -Identity $gateway

$nodeObjects = Get-ADComputer -Filter * -SearchBase "TOP LEVEL AD OU OF ENDPOINTS"

foreach ($nodeObject in $nodeObjects){

    Set-ADComputer -Identity $nodeObject -PrincipalsAllowedToDelegateToAccount $gatewayObject
    $output += ($nodeObject.DNSHostName + ",")
}
$filePath = $Env:USERPROFILE  + '\Desktop\UploadToHonolulu.txt'
$output | Out-File -FilePath $filePath -NoClobber

This PS script adds the Honolulu server to the PrincipalsAllowedToDelegateToAccount setting of every server in the desired OU, then compiles the FQDNs of all endpoints into a .txt file on the user's desktop.  Open the Honolulu console and add all the servers in one go by importing the .txt file.
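To spot-check that the delegation has landed on a given endpoint (the computer name below is illustrative):

# Confirm the gateway's computer account now appears against the endpoint.
Get-ADComputer -Identity "NODE01" -Properties PrincipalsAllowedToDelegateToAccount |
    Select-Object -ExpandProperty PrincipalsAllowedToDelegateToAccount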

"Voila!"


Managing an End Point
Now that your endpoints have been discovered, simply click on one to see the vast amount of administration that can be done from this simple console.  Along with a simple but effective overview of the endpoint's performance, it's quick to see how this tool will help many sys admins going forward.



Conclusion
It has been a while since Microsoft have developed a tool that fills a genuine gap.  Since the release of Windows Server Core, uptake has been slow, with few companies having the confidence to deploy it to production environments due to the large learning curve involved in performing simple, basic administration tasks on an endpoint.  The MMC snap-ins fill the majority of that gap, but not entirely, and they are clunky at best.  With Project Honolulu, sys admins can now perform most (if not all) administration tasks from a web console….

Good Work Mr. Microsoft!

Further info found here:


 

Wednesday 13 September 2017

SSDT - to script or not to script!

I have been using SSDT for years, through its various incarnations, and I am a huge fan. I can say I have fond memories of Gert the Data Dude posting his way to blogger awesomeness and me being extremely grateful that he did. Gert has moved on to other parts of the Microsoft universe but the product has survived and seems to be a fully-fledged senior citizen in the Visual Studio landscape. Worryingly, Visual Studio has also started to entertain a new suitor, Red-Gate, and their devops offering is quite different from the model-based SSDT project...we shall see what happens there.

Anyway, the reason for the post is that I have just learned something rather interesting about how SSDT, VS, MSBuild and SqlPackage.exe co-operate to get scripts added to the beginning and end of a database change script.

The Requirement:
I have just started using tSQLt to write database unit tests after years of using the SQL Server Unit Test in the Visual Studio Test Project and my plan is to integrate the two different frameworks so that I can benefit from the fakes and assertion utilities in tSQLt but still have all my tests visible in the Visual Studio Test Explorer. I needed to have tSQLt deployed as part of the database project to make this happen and I wanted it to be extremely easy to upgrade tSQLt when a new version is released.
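To give a flavour of the integration, the T-SQL body of a Visual Studio SQL Server Unit Test can simply hand off to tSQLt; when a tSQLt assertion fails it raises an error, which in turn fails the Visual Studio test. A minimal sketch - the test class and test name are illustrative:

-- Body of a Visual Studio SQL Server Unit Test (names are illustrative).
-- A failing tSQLt assertion raises an error, which fails the VS test.
EXEC tSQLt.Run 'CustomerTests.[test InsertCustomer sets CreatedDate]';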

The Plan:
Add the tSQLt.class.sql downloaded from tSQLt.org as a post-deployment script and have the project decide whether to include it based on a project variable. Sounds simple, but there is a catch - you can't simply add some conditional logic to the post-deployment script like this:

IF ('$(UseTestFramework)' = 'true')
:r .\tSQLt.class.sql

The Problem:
It would be nice if you could, but by adding a T-SQL statement to the script, SSDT treats the included tSQLt.class.sql as embedded T-SQL and throws an exception, because the file is crammed with GO statements - GO is a batch separator understood by the tooling, not by the engine, so it cannot appear inside an IF block. So you may try this:

:r .\$(TestFrameworkFileName).sql

In this case the sqlcmd variable value can be set differently in each environment's publish settings file, and for the environments where it is not needed an empty substitute file can be used. The problem is that SqlPackage.exe uses the DEFAULT value of the sqlcmd variable to evaluate the :r expression, not the value set in a publish settings file, so you end up with the same result whatever you do.
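For illustration, this is roughly what the override looks like in a .publish.xml profile (variable name taken from the example above); because the :r include is resolved at build time against the project's default value, this override never affects which file gets included:

<!-- Sketch of a .publish.xml override; it is ignored when the
     :r include path is resolved at build time. -->
<ItemGroup>
  <SqlCmdVariable Include="TestFrameworkFileName">
    <Value>tSQLt.class</Value>
  </SqlCmdVariable>
</ItemGroup>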

The Solution:
It is similar but with a twist: you need to set the value of a sqlcmd variable in the database project file using an MSBuild variable that can be determined at build time. The legendary Gert describes the solution here: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/745a2189-62db-4739-8437-8aac16affa05/conditional-post-deployment-for-unit-test?forum=ssdt

So, the steps to use conditional logic to determine if a particular script is included are quite simple:

  1. Add a sqlcmd variable named Configuration to the database project
  2. Unload the project
  3. Edit the .sqlproj and set the value of the variable to $(Configuration) - that's the MSBuild variable
  4. Reload the project
  5. Add a Debug.sql and a Release.sql file to the project, to be included from the post-deployment script
  6. Use the $(Configuration) sqlcmd variable in the post-deployment script to include the correct file based on the build configuration, as in the sketch below
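Putting it together, a minimal sketch (file and variable names as per the steps above; the SqlCmdVariable entry is hand-edited while the project is unloaded):

<!-- In the .sqlproj: bind the sqlcmd variable to the MSBuild Configuration property -->
<ItemGroup>
  <SqlCmdVariable Include="Configuration">
    <DefaultValue>Debug</DefaultValue>
    <Value>$(Configuration)</Value>
  </SqlCmdVariable>
</ItemGroup>

-- In the post-deployment script: include the file matching the build configuration
:r .\$(Configuration).sql

-- Debug.sql pulls in the test framework; Release.sql is simply left empty
:r .\tSQLt.class.sql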
The downside is that your optional execution paths are determined by the number of build configurations you have, rather than by the content of your publish settings file...but it is better than nothing!