Automating Repository Creation in Azure DevOps

Background

In a project heavy on databases and SSIS projects, the decision was made to split up the large system into smaller modules. It was also decided to keep each module in its own repository in the project.

When we started, the project had 25 different modules, and the layout of each module was the same: a database, an SSIS project and a web page. It was decided to host this in Azure DevOps. Setting up 25 identical repositories is not something I would like to do manually, and if there were already 25 modules it was likely there would be more.

The decision was made to keep the build YAML pipelines in their own repository, so that all the pipelines could share resources like PowerShell scripts and tests.

The first 25 repositories were created with the Azure CLI and PowerShell, but only one person had the knowledge and the rights to do it. That was not optimal, and even after the script was made easier to run, others still didn’t feel they had the knowledge to use it. Implementing a pipeline in the project would let a broader range of contributors create new modules.

So, what should the pipeline do?

  • Get a name for the module when triggering the pipeline.
  • Create a repository with the given name.
  • Set the folder structure for the module in the created repository.
  • Create and register a build pipeline in the pipeline repository.
  • Configure Policies for the repositories and Areas for the boards.
  • If something fails, clean up what has been done.

Having most of this already working in PowerShell with the Azure CLI, how much trouble could it be?

When designing the pipeline, it was split into five stages.

  • Setup and Validation.
  • Create the repository and folder structure and default files.
  • Create the pipeline.
  • Create administration settings.
  • Cleanup (in case of failure).
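
The five stages above can be sketched as a pipeline skeleton. This is a hypothetical layout, not the actual pipeline: the stage names, the parameter name ModuleName and the empty job lists are illustrative placeholders.

```yaml
# Hypothetical skeleton of the five-stage pipeline; names are illustrative
parameters:
  - name: ModuleName          # asked for when the pipeline is triggered
    type: string

trigger: none                 # run manually only

stages:
  - stage: SetupValidation
    jobs: []                  # validate the module name, check it doesn't already exist

  - stage: CreateRepository
    dependsOn: SetupValidation
    jobs: []                  # create the repo, folder structure and default files

  - stage: CreatePipeline
    dependsOn: CreateRepository
    jobs: []                  # create and register the build pipeline

  - stage: Administration
    dependsOn: CreatePipeline
    jobs: []                  # policies for the repo and areas for the boards

  - stage: Cleanup
    dependsOn:
      - CreateRepository
      - CreatePipeline
      - Administration
    condition: failed()       # only runs when something upstream failed
    jobs: []                  # remove whatever was created
```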

Since this was one big PowerShell script, the work started by breaking the script up into smaller scripts, while at the same time building up the pipeline using the PowerShell@2 task.

Create repository

Since creating repositories is the main purpose of the pipeline, this is where the work started. Starting small: a minimal pipeline that just creates a repository.

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      az repos create --name ${{ parameters.ModuleName }} --project "Demo" --organization "https://dev.azure.com/xxxx" --output json 2>&1
    showWarnings: true
    pwsh: true
  env:
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)

ERROR: TF401027: You need the Git ‘CreateRepository’ permission to perform this action. Details: identity ‘Build\xxxxx-xxx-xxx-xxx-xxxx, scope ‘project’.

Almost a success: the build pipeline needs the right to create repositories in the project. Go to Project Settings – Repositories – All Repositories – Security and change “Create repository” to “Allow” for the build service.

Then the problem started….

Even though the rights for the build service were set correctly, it still said it was missing the Create Repository right.

I was stuck, since if you need to do something in a pipeline that requires a user right, you can normally use the system access token. So why didn’t it work?
The system access token is the “PAT” that lets the build service access resources in Azure DevOps.

After spending time searching and asking ChatGPT and Copilot, and basically getting the same suggestions back every time, it was only when I looked at the pipeline settings that I noticed the setting “Protect access to repositories in YAML pipelines”.

If “Protect access to repositories in YAML pipelines” is enabled (it is by default), the pipeline is limited to working only in the repository where it is located!

“Protect access to repositories in YAML pipelines” has the info “Also, generate a job access token that is scoped to repositories that are explicitly referenced in the YAML pipeline.” This means the pipeline can’t go outside the repository where it is located unless we explicitly declare, in the pipeline, that we need access to another repository. But the requirement for the pipeline was that the name of the repository is set at execution time.

So, what to do?

Asking ChatGPT with all this info, it suggested using a service connection to Azure. Exploring this option, it looked like the right way to go. The recommended way to create a service connection to Azure is in “Project settings”: in the “Pipelines” section, click “Service connections” and then “New service connection”. The connection type to choose is “Azure Resource Manager”. The easiest is to accept the defaults (if you have the rights in Azure).

Leave the identity type and credential at the defaults. Choose the scope level you want to use, choose the resource group, and give the connection a name according to your standard. Choose whether to give it access to all pipelines. This will create a service connection and an app registration. You can then add the app registration to your Azure DevOps organization as a user and give it rights in the project like a normal user. Note that the app registration will not have the same name as the service connection.

This was the app that was created from the Azure DevOps service connection. I changed its group to Readers to give it minimal rights in Azure, and then added only the rights it needs in the project: I granted the user the right to create repositories in the project Demo.

Then the pipeline was changed:

- task: AzureCLI@2
  inputs:
    azureSubscription: 'AZDO-Demo-SC'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      
      az repos create --name ${{ parameters.ModuleName }} --project "Demo" --organization "https://dev.azure.com/xxxx" --output json 2>&1

SUCCESS! The pipeline created the repository in the project. The small downside is that this solution consumes a license in Azure DevOps. I haven’t found another solution when the “Protect access to repositories in YAML pipelines” setting can’t be disabled.

Using the AzureCLI@2 task with the input “azureSubscription: <ServiceConnectionName>”, you can run Azure CLI commands against Azure DevOps, because the task automatically logs in using the service connection. Since this “user” needs access to the project, the service connection’s app must be added as a user in the Azure DevOps organization and in the project.

Git operations

Now that the pipeline can create the repository, it’s time to prep the new repository with the default folder structure and files that all modules use. Let’s start with a PowerShell task:

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Write-Host "Cloning repo: $env:REPO_NAME"

      # Construct the repo URL (adjust org/project as needed)
      $repoUrl = "https://dev.azure.com/psadi/Demo/_git/$env:REPO_NAME"

      # Use System.AccessToken for authentication
      git -c http.extraheader="AUTHORIZATION: bearer $env:SYSTEM_ACCESSTOKEN" clone $repoUrl
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
    REPO_NAME: ${{ parameters.repoName }}


The result wasn’t good:

remote: TF401019: The Git repository with name or identifier Module_T does not exist or you do not have permissions for the operation you are attempting.

fatal: repository ‘https://dev.azure.com/xxxx/Demo/_git/Module_T/’ not found

And we are back to errors that all say the repository doesn’t exist, but it does, since it was just created. Since PowerShell didn’t work, can we use the same strategy as in the create-repository task?

- task: AzureCLI@2
  inputs:
    azureSubscription: 'AZDO-Demo-SC'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $repoName = "${{ parameters.repoName }}"
      Write-Host "Git Clone"
      git clone https://dev.azure.com/psadi/Demo/_git/$repoName

One small step closer, but still not able to clone:

fatal: could not read Username for ‘https://dev.azure.com’: terminal prompts disabled

Can the service connection we created earlier help here? The answer seems to be yes. The pipeline can request a bearer token from the service connection with:

az account get-access-token --resource 499b84ac-1321-427f-aa17-267ca6975798


The GUID is the same for everyone: it is the well-known resource ID for Azure DevOps.
This bearer token works like a short-lived PAT for the service connection, and since it is requested with az, the pipeline uses the AzureCLI@2 task:

- task: AzureCLI@2
  inputs:
    azureSubscription: 'AZDO-Demo-SC'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource 499b84ac-1321-427f-aa17-267ca6975798 --query accessToken -o tsv
      Write-Host "Cloning repo: ${{ parameters.repoName }}"

      # Construct the repo URL (adjust org/project as needed)
      $repoUrl = "https://dev.azure.com/xxxx/Demo/_git/${{ parameters.repoName }}"

      # Use the service connection token for authentication
      git -c http.extraheader="AUTHORIZATION: bearer $token" clone $repoUrl

The pipeline is all green.

But we don’t have to run Git operations in the AzureCLI task. We can save the token as a pipeline variable so that later tasks can use it:

- task: AzureCLI@2
  name: SCToken
  inputs:
    azureSubscription: 'AZDO-Demo-SC'
    scriptType: 'pscore'
    scriptLocation: 'inlineScript'
    inlineScript: |
      $token = az account get-access-token --resource 499b84ac-1321-427f-aa17-267ca6975798 --query accessToken -o tsv
      Write-Output "##vso[task.setvariable variable=AdoSCToken;issecret=true]$token"

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $token = "$(AdoSCToken)"
      Write-Host "Cloning repo: $env:REPO_NAME"

      # Construct the repo URL (adjust org/project as needed)
      $repoUrl = "https://dev.azure.com/psadi/Demo/_git/$env:REPO_NAME"

      # Use the service connection token for authentication
      git -c http.extraheader="AUTHORIZATION: bearer $token" clone $repoUrl
  env:
    REPO_NAME: ${{ parameters.repoName }}

To summarize

If the Azure DevOps project has the setting “Protect access to repositories in YAML pipelines” enabled, the pipeline is locked down to its own repository unless you explicitly declare, in the resources section, which repositories you want to use:

resources:
  repositories:
    - repository: aliasRepoName    # alias you choose
      type: git                    # Azure Repos Git
      name: MyProject/Repo         # project/repo name
      ref: main                    # branch to checkout

steps:
- checkout: aliasRepoName

Using this allows the pipeline to access that repository. But if the pipeline needs a repository name that is set dynamically while the pipeline is running, there isn’t any way to declare it like above.

When the pipeline gets the repository name at runtime, one way to solve it is to use a service connection and authenticate to Azure DevOps with that account. A PAT could also be used, but then the solution depends on a user who needs to rotate the PAT from time to time; with the service connection setup that problem doesn’t exist. The service and build accounts require custom permissions throughout the solution, and it took a lot of running, changing one thing, and running again to see if an error was solved. The same goes for deciding whether a task should be a PowerShell or an AzureCLI task, and whether to use System.AccessToken or the token from the service connection. It was a frustrating but very educational experience.

DBATools 2.x certificate chain issues

If you are moving from DBATools 1.x to 2.x, the way DBATools connects to servers has changed. DBATools now uses Microsoft’s new drivers, and the driver will by default check the server’s certificate in order to talk to the server over an encrypted channel. If you haven’t set up the SQL Server with a trusted certificate, you have a choice when creating the connection. The first option:

$Conn = Connect-DbaInstance -SqlInstance xxx -TrustServerCertificate

This will connect to the server and trust its self-signed certificate. Or you can set DBATools to always trust server certificates:

Get-DbatoolsConfig -FullName sql.connection.trustcert # should see the value set to false
Set-DbatoolsConfig -FullName sql.connection.trustcert -Value $true

This sets the config value for the current session only. To make it permanent, you have to register the settings:

Get-DbatoolsConfig | Register-DbatoolsConfig

Remember, if you choose the second option and later add real certificates to the SQL Server, you have to change the setting back:

Reset-DbatoolsConfig -FullName sql.connection.trustcert

syspolicy_purge_history

Multiple SQL instances problem

If you have multiple SQL instances on the same server or cluster, you might run into a problem with the SQL Agent job syspolicy_purge_history. The job doesn’t fail, but it might try to log in to the wrong SQL instance on your server. I found this at one of my clients’ installations, and when I tried it in my lab setup I had the same issue. This was tested on SQL Server 2019.

The issue is well described [here](https://www.codykonior.com/2015/05/31/login-errors-with-syspolicy-purge-history/) by Cody Konior.

I have been using Cody’s solution without any problem on the instances where this issue has been a problem, so credit goes to Cody for the solution below.

EXEC msdb.dbo.sp_update_jobstep
    @job_name  = 'syspolicy_purge_history',
    @step_name = 'Erase Phantom System Health Records.',
    @step_id   = 3,
    @command   = N'$applicationName = "SQLPS ($env:USERNAME@$env:COMPUTERNAME)"
$SQLServerConnection = New-Object System.Data.SqlClient.SqlConnection
$SQLServerConnection.ConnectionString = "Data Source=$(ESCAPE_NONE(SRVR));Initial Catalog=master;Integrated Security=SSPI;Application Name=$applicationName"
$PolicyStoreConnection = New-Object Microsoft.SqlServer.Management.Sdk.Sfc.SqlStoreConnection($SQLServerConnection)
$PolicyStore = New-Object Microsoft.SqlServer.Management.Dmf.PolicyStore ($PolicyStoreConnection)
$PolicyStore.EraseSystemHealthPhantomRecords()
'
GO

If you are not using policies, this problem is just a failed login on a different SQL instance. But for me that is annoying enough to fix.

It should be noted that you can also just disable the step if you aren’t using policies.

Microsoft VPN Split tunnel

While setting up a VPN to my home, so that I could access my home network and the lab servers I have there, I wanted to use split tunneling to avoid sending all traffic over the VPN.

To do this I used PowerShell and the following code:

$VPN = @{
    L2tpPsk               = 'Your Shared secret key'
    Name                  = 'Connection name'
    ServerAddress         = 'x.x.x.x'
    AuthenticationMethod  = 'MSCHAPv2'
    EncryptionLevel       = 'Maximum'
    SplitTunneling        = $True
    TunnelType            = 'L2TP'
    UseWinlogonCredential = $false
    RememberCredential    = $True
}

$VPNRoute = @{
    ConnectionName    = $VPN.Name
    DestinationPrefix = 'y.y.y.y/24'
}

Add-VpnConnection @VPN -force
Add-VpnConnectionRoute @VPNRoute

I chose to use splatting because I think it makes the code easier to read. I don’t use the Windows credential for this connection, so I have to start it and enter the user name and password to be able to connect.

The route is needed for me to access the local network at home, since I use a split tunnel. If there were more networks behind the VPN, more routes could be added.
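Adding more routes extends naturally in the same splatting style. A sketch with two hypothetical destination prefixes (replace them with the networks behind your VPN):

```powershell
# Hypothetical prefixes; replace with the networks behind your VPN
$Prefixes = 'y.y.y.y/24', 'z.z.z.z/24'

foreach ($Prefix in $Prefixes) {
    # One route per network that should go through the tunnel
    Add-VpnConnectionRoute -ConnectionName $VPN.Name -DestinationPrefix $Prefix
}
```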

SSRS – Kerberos Authentication double hop

Reporting Services normally uses NTLM authentication when you install it, but you can set it up for Kerberos authentication with some work. The account that SSRS will use can be one of the following three:

  • Virtual Account
  • Network Service
  • Domain Account

Of these three, the virtual account will not share resources with other applications, the Network Service account is shared with other services, and whether the domain account is shared is up to the admin to decide. So the best option is to choose between a virtual account and a domain account.

To see how to set up Kerberos authentication for SSRS, a new lab was created with a domain controller, two SQL Servers, two SSRS servers and two clients, one Windows 8 and one Windows 10. Two service accounts were created, RS1 and RS2.
SQL Server 2019 was installed, and Reporting Services 2019 was a standalone installation.
In the domain, a group policy was created so that web pages use the fully qualified domain name, aka *.my.lab.

Zone assignment

SSRS1

Reporting Services was set up with a virtual account for the service and a domain account to connect to the reporting database. The domain account RS1 was used because SSRS recommends that you don’t use a virtual account, aka the computer account, to connect to the SQL Server; using the virtual account for the report database can lead to problems.

On the SSRS1 server’s computer account, delegation was set to “Trust this computer for delegation to specified services only” and it was delegated to SQL1, which holds the database we will run the report against.

No SPN record was created, since the service will use the HOST SPN.

SSRS2

The SSRS2 server was set up with the domain account RS2 for the Reporting Services service, and the same account was also used to connect to the SQL Server for the reporting database.

For the domain account RS2 you need to create SPN records. Following the documentation at Microsoft Docs, SPN records were created of type HTTP/ComputerName and HTTP/ComputerName.domain:

SETSPN -S HTTP/ssrs2.my.lab rs2
SETSPN -S HTTP/ssrs2 rs2



The SPN records need to be created before you can access the delegation tab for the user. The delegation was set up in the same way as for the SSRS1 computer account in the image above. If there isn’t a delegation tab for the user account, then the SPN records haven’t been created correctly or you haven’t reloaded the domain.
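To verify that the SPNs actually landed on the account, and to check for duplicate registrations (which also break Kerberos), setspn can list and cross-check them. A quick check, assuming the RS2 account from above:

```powershell
# List all SPNs registered on the RS2 service account
SETSPN -L rs2

# Search the forest for duplicate SPN registrations
SETSPN -X
```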

For both SSRS servers the config file rsreportserver.config was modified: in the <Authentication> section, <RSWindowsNegotiate /> was added so that it looks like the code section below.

<AuthenticationTypes>
	<RSWindowsNegotiate />
	<RSWindowsNTLM/>
</AuthenticationTypes>

After the change to the config file, the servers and the clients were restarted.

To see whether we are connecting with Kerberos or NTLM, you can use the following query:

SELECT  s.session_id
	, c.connect_time
	, s.login_time
	, s.login_name
	, c.protocol_type
	, c.auth_scheme
	, s.HOST_NAME
	, s.program_name
FROM sys.dm_exec_sessions s
  JOIN sys.dm_exec_connections c
    ON s.session_id = c.session_id

Edge Chromium

All four URLs could access the report server and run the report that queries SQL1:

  • HTTP://ssrs1/reports
  • HTTP://ssrs1.my.lab/reports
  • HTTP://ssrs2/reports
  • HTTP://ssrs2.my.lab/reports

The result of the above T-SQL query was that all Edge Chromium connections were made over Kerberos authentication.

IE 11

All four URLs could access the report server and run the report that queries SQL1:

  • HTTP://ssrs1/reports
  • HTTP://ssrs1.my.lab/reports
  • HTTP://ssrs2/reports
  • HTTP://ssrs2.my.lab/reports

The result of the above T-SQL query was that all IE 11 connections were also made over Kerberos authentication.

All looks good and we are done. Well, that is until someone needs to do some remote administration on SSRS2, like adding the server to Server Manager.

Server Manager error
Windows Admin error
PowerShell 7 Error

The above errors are due to the fact that we have said that HTTP on SSRS2 is handled by the RS2 account and not by the computer account.
The solution is to remove the SPNs we just created, but when we remove them the Kerberos double hop will stop working.

SETSPN -D HTTP/ssrs2.my.lab rs2
SETSPN -D HTTP/ssrs2 rs2

According to the SPN documentation you could add a port to the SPN, so the HTTP SPNs were recreated with the port:

SETSPN -S HTTP/ssrs2.my.lab:80 rs2
SETSPN -S HTTP/ssrs2:80 rs2

After adding the above SPNs, the server and the client used in the test were restarted. This time, accessing http://ssrs2/reports and http://ssrs2.my.lab/reports displays a login prompt. So even if the SPN documentation says you can use ports, it does not work with SSRS at least. This solution did, however, solve the PowerShell remoting problem.

CNAME

Another solution that could work is to use a CNAME to access SSRS. After the CNAME record is created for ssrs2, the SPNs should be created for it:

SETSPN -S HTTP/cname.my.lab rs2
SETSPN -S HTTP/cname rs2

After a reboot of the server and the client (just to make sure), the test in Edge Chromium again results in a login prompt. Looking in the client’s System event log shows an error with source Kerberos.

Kerberos error in System log

I haven’t found a solution for this error when using a CNAME so far.

A-Record

Instead of using a CNAME, an alternative is to use a different A record. Using an A record means that the server must have a static IP address, either via a static DHCP reservation or by using a static IP on the server.

After the A record is created in the DNS server, the SPN records need to be created on the service account:

SETSPN -S HTTP/arecord.my.lab rs2
SETSPN -S HTTP/arecord rs2

Accessing the report server by http://arecord.my.lab/reports and http://arecord/reports now works without any problem.
Accessing the report server by its hostname will result in a login prompt that will not go away.

Conclusions

So far, my testing of SSRS and the double hop shows that using the virtual account, with Kerberos delegation on the computer account, is the easiest way to achieve the double hop from the SSRS server to the source. If a domain account must be used for the service account, then a separate A record is needed in order not to break remote administration.

SQL Server 2012 SQLPS.exe problem

For security reasons it was decided to remove PowerShell 2.0 from our servers and install PowerShell 5.1. It was tested on a few servers, but not on one with SQL 2012 installed. After PowerShell 2.0 was removed, it was noticed that the agent job “syspolicy_purge_history” wasn’t working correctly. Further research showed that SQLPS.exe was unable to start and crashed with an error. The problem didn’t occur on SQL 2014 and higher.

Working with Microsoft, it was concluded that there was no problem with the server or PowerShell, so the problem had to be related to SQLPS.exe or the SQL Server. A Windows Server 2016 machine that had both SQL 2014 and SQL 2012 installed was compared, since the agent job worked on SQL 2014 but not on SQL 2012. A difference was discovered between the folders “C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn” and “C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn”: for SQL 2012 the file SQLPS.exe.config was missing. After copying the file from SQL 2014 to SQL 2012, it all started working.
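The fix itself is a one-liner. A sketch using the paths from the comparison above (run from an elevated PowerShell prompt):

```powershell
# Copy the missing config file from the SQL 2014 tools folder to the SQL 2012 one
Copy-Item -Path 'C:\Program Files (x86)\Microsoft SQL Server\120\Tools\Binn\SQLPS.exe.config' `
          -Destination 'C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\'
```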

Pester

If you are using PowerShell and want to test the code you are writing, so that your functions work as you expect, you should take a look at Pester.

Don Jones and Adam Bertram have a very good book about Pester on LeanPub.

Here I will share my problems and learnings about Pester.

When I started reading the book I had Pester 3.4 installed, the version Microsoft ships by default with Windows 10. But the Pester tests from the book were failing for me. The answer was to upgrade Pester to the latest version. After that, the tests worked perfectly!
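To show what a test looks like, here is a minimal sketch with a hypothetical Get-Sum function, using the Should syntax from Pester 5 (Pester 3 used `Should Be` without the dash, which is one reason the book's examples can fail on old versions):

```powershell
# Function under test (hypothetical example)
function Get-Sum {
    param([int]$A, [int]$B)
    $A + $B
}

# A minimal Pester test; save as Get-Sum.Tests.ps1 and run Invoke-Pester
Describe 'Get-Sum' {
    It 'adds two numbers' {
        Get-Sum -A 2 -B 3 | Should -Be 5
    }

    It 'handles negative numbers' {
        Get-Sum -A (-2) -B 1 | Should -Be (-1)
    }
}
```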


Add shutdown, restart and logoff tiles to Windows 8

I’m trying out Windows 8 on my spare laptop, and while I find Win8 to be a good and solid OS, like many others I miss the start button. Though I don’t miss it that much; what I miss more is an easy way to shut down and restart. Of course you can hit Alt-F4 on the desktop to get the shutdown/restart window.

I wanted an easier way, and while looking into another script I found this script in the TechNet Script Gallery. This PowerShell module allows you to create tiles for shutdown, restart and log off in the Windows 8 start menu.

Error 2738. Could not access VBScript run time for custom action.

Found the solution below on this blog

Fixing Error 2738 on Windows 7 64-bit

If you have a 64-bit operating system, you will need to follow these steps:

Open a command line window as Administrator, which requires clicking Start, typing cmd in the search box, then right-clicking on the cmd.exe program and choosing Run as Administrator. Next, paste the following in the command line and hit Enter:

reg delete "HKCU\SOFTWARE\Classes\Wow6432Node\CLSID\{B54F3741-5B07-11CF-A4B0-00AA004A55E8}" /f

That previous command removes the incorrect entry for VBScript support on 64-bit Windows 7. Next you need to make sure VBScript is properly registered by typing:

c:\windows\syswow64\regsvr32 vbscript.dll

or simply regsvr32 vbscript.dll if you are already in the syswow64 directory.

After following these steps you should be able to install your program without issue.


Fixing Error 2738 on Windows 7 32-bit

For a 32-bit operating system, the steps are similar, but slightly different:

Open a command line window as Administrator, by clicking Start, typing cmd in the search box, then right-clicking on the cmd.exe program and choosing Run as Administrator. Next paste the following in the command line and hit enter.

reg delete "HKCU\SOFTWARE\Classes\CLSID\{B54F3741-5B07-11CF-A4B0-00AA004A55E8}" /f

That previous command removes the incorrect entry for VBScript support on 32-bit Windows 7. Next you need to make sure VBScript is properly registered by typing:

c:\windows\system32\regsvr32 vbscript.dll

or simply regsvr32 vbscript.dll if you are already in the system32 directory.

After following these steps you should be able to install your program without issue on a 32-bit version of Windows 7.