PowerShell Archives - dbi Blog

syspolicy_purge_history job and PowerShell ExecutionPolicy


With SQL Server 2008, Microsoft introduced a system job called “syspolicy_purge_history”. This job is installed and enabled by default, contains three steps, and is scheduled at 02:00 AM:

[Screenshots: the syspolicy_purge_history job, its steps, and its schedule]

What is this job? And what is it for?

SQL Server 2008 also added a new feature called Policy Based Management. When your policies run, the results are stored in msdb; without a purge mechanism, msdb would keep growing. So Microsoft introduced the famous system job named “syspolicy_purge_history” to clean the results older than the number of days defined in the “HistoryRetentionInDays” property of Policy Management.

 

Should I disable it?

Definitely not. This job is part of the proper functioning of SQL Server.

 

Should I care about it?

We recommend monitoring this job just like any other DBA job.

 

But recently, this system job failed after every automatic or manual execution…

[Screenshot: failed executions of the syspolicy_purge_history job]

If you look at the details, you notice the third step failed at the following line: “set-executionpolicy RemoteSigned -scope process -Force”

[Screenshot: detailed job step error]

 

Apparently, the SQL Server engine was not able to change the execution policy to “RemoteSigned”. Does the engine have enough permissions? Should I modify the execution policy manually as Administrator?

Let’s see the current “ExecutionPolicy” configuration for the SQL Server PowerShell. Open the console from SQL Server Management Studio:

[Screenshot: opening the SQL Server PowerShell console from SQL Server Management Studio]

And, to start gently, you get a beautiful error:

[Screenshot: SQL Server PowerShell execution policy error]

 

PowerShell executed the same command as previously, with the same (lack of) success…

Fortunately, we have more details: Windows PowerShell tried to change the “ExecutionPolicy” setting, but a policy has overridden the change…

 

Let’s see the “ExecutionPolicy” depending on the different scopes:

[Screenshot: ExecutionPolicy value for each scope]
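For reference, the command behind this list is presumably the built-in one (an assumption, but it is the standard way to display the policy for every scope):

# List the effective ExecutionPolicy for every scope
Get-ExecutionPolicy -List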

The “ExecutionPolicy” at the “MachinePolicy” scope is set to “AllSigned” (which is more restrictive), and may override the configuration at the “Process” scope. By setting the “ExecutionPolicy” to “RemoteSigned” at the “MachinePolicy” scope, the problem may be resolved…

[Screenshot: attempt to modify the ExecutionPolicy at the MachinePolicy scope]

 

…or not! We cannot change the setting manually, even if the console is opened as Administrator!

But as you can see, we have more details: we cannot change the “ExecutionPolicy” this way; we must change it through Group Policy.

Let’s open Local Group Policy Editor, and let’s browse to “Local Computer Policy\Computer Configuration\Administrative Templates\Windows Components\Windows PowerShell”.

[Screenshot: Local Group Policy Editor]

 

But no policy is configured…

 

Since I cannot change the “ExecutionPolicy” at the “MachinePolicy” scope through the SQL Server PowerShell console, I suggest changing it directly in the registry 😉

[Screenshot: ExecutionPolicy value in the registry]

…the policy is correctly set to “RemoteSigned”, but this value is overridden from somewhere else… apparently not locally, because I found no GPO configured… Someone must have set a GPO for the “ExecutionPolicy” on the Domain Controller…

 

And indeed, there is a GPO which affects all the servers of the organization:

[Screenshot: the domain GPO affecting all servers]

 

 

To resolve this, Microsoft proposes two workarounds in KB2995870:

  • Create a New Organization Unit for this server…
  • Or simply disable this GPO…

 

To be honest, the domain administrator does not want to create a new Organizational Unit for only one server. But he does not want to disable this GPO for all the servers of the organization either.

 

Fortunately, this does not mean we are stuck 😉 Even if Active Directory can redefine the policy on all the servers of an organization, the local computer can always have the last word!

I force the “ExecutionPolicy” configuration directly in the local registry of the server where SQL Server is installed:

[Screenshot: ExecutionPolicy forced in the local registry]
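A minimal PowerShell sketch of that registry change (the exact key is an assumption based on the documented machine-policy location; run it elevated and adapt it to your environment):

# Machine-wide PowerShell policy key read by the "Turn on Script Execution" GPO
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\PowerShell'
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name EnableScripts -Value 1 -Type DWord
Set-ItemProperty -Path $key -Name ExecutionPolicy -Value 'RemoteSigned' -Type String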

 

And the miracle happens:

[Screenshot: ExecutionPolicy now effective at the MachinePolicy scope]

[Screenshot: syspolicy_purge_history job executing successfully]

 

 

To conclude, I will simply and briefly say: the registry is my best friend 😉

 



SCOM: change group state to maintenance mode with PowerShell


Some weeks ago, I wrote a blog post about the creation of SCOM groups in order to subscribe to alerts. Subscribing to alerts is of course necessary to receive the alerts concerning our group. But during operations like updates or patching, we do not want to be flooded with alerts. To avoid those unexpected emails, we need to place our group, and thus the objects contained in this group, in maintenance mode.

I will use a PowerShell script to do this job.
The parameters of my script are:

  • ManagementServer: mandatory parameter containing management server name
  • GroupName: mandatory parameter containing display name of the target group
  • DurationTimeMin: mandatory parameter containing the duration maintenance time in minutes
  • Comment: mandatory parameter containing a comment for the maintenance time
  • Reason: mandatory parameter containing the reason for the maintenance; the value is predefined and must be one of: PlannedOther, UnplannedOther, PlannedHardwareMaintenance, UnplannedHardwareMaintenance, PlannedHardwareInstallation, UnplannedHardwareInstallation, PlannedOperatingSystemReconfiguration, UnplannedOperatingSystemReconfiguration, PlannedApplicationMaintenance, ApplicationInstallation, ApplicationUnresponsive, ApplicationUnstable, SecurityIssue, LossOfNetworkConnectivity

In PowerShell:

param(
 [Parameter(Mandatory=$true)][string]
 $ManagementServer,
 [Parameter(Mandatory=$true)][string]
 $GroupName,
 [Parameter(Mandatory=$true)][int32]
 $DurationTimeMin,
 [Parameter(Mandatory=$true)][string]
 $Reason,
 [Parameter(Mandatory=$true)][string]
 $Comment
 )

I need to import the necessary module for SCOM:

# Import necessary module for SCOM
 Import-Module OperationsManager

I will now create a persistent connection to my Management Group:

# Create connection to SCOM Management Group
 New-SCOMManagementGroupConnection -ComputerName $ManagementServer

I now just have to find my SCOM group by its name and place it in maintenance mode for the duration I specified before:

# Find group and place in maintenance mode
ForEach ($Group in (Get-SCOMGroup -DisplayName $GroupName))
{
    If ($Group.InMaintenanceMode -eq $false)
    {
        $Group.ScheduleMaintenanceMode([datetime]::Now.ToUniversalTime(), `
            ([datetime]::Now).AddMinutes($DurationTimeMin).ToUniversalTime(), `
            "$Reason", "$Comment", "Recursive")
    }
}

To run my script, I open a PowerShell console and execute the following command:

[Screenshot: script execution]
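Reconstructed from the parameters defined above (the instance and group names are the values reused later in this blog series, so treat them as examples), the call looked something like:

.\GroupMaintenanceMode.ps1 -ManagementServer "SCOM2012R2.adsts.local" -GroupName "SQL Server Production Instances" -DurationTimeMin 60 -Reason "PlannedOther" -Comment "Maintenance for Patching"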

I now go to SCOM and check my group. I see that it contains two SQL Server instances:

[Screenshots: the group and its two SQL Server instances in SCOM]

Those two instances are now in maintenance mode:

[Screenshot: the two instances in maintenance mode]

This simple script is very practical for placing groups in maintenance mode, and I will use it in a future blog post to schedule maintenance mode for SCOM groups with PowerShell.
See you soon 😉


SCOM: schedule group maintenance task with PowerShell


In my last blog post, here, I spoke about how to place a SCOM group in maintenance mode. This script becomes really interesting when integrated with the Windows Task Scheduler. In the end, the main purpose is to plan a maintenance window for our different servers.

Let’s see how we can do that with PowerShell script.

First, I tried to use the cmdlet Register-ScheduledTask, which can be used to register a scheduled task definition on a local computer. But I quickly encountered a problem when I wanted to schedule my task every fourth Thursday of each month with the cmdlet New-ScheduledTaskTrigger. I was a little bit disappointed to see that it is not possible… only daily and weekly recurring schedules are available… no monthly option… oops 😥

After a small search on the web, I found what I was looking for: I will create a COM object with the “Schedule.Service” ProgID (Programmatic IDentifier). It is an older method than the first solution I found, but it guarantees the possibility to schedule my task exactly as I want.

First I need to provide some input parameters to my script:

  • TaskName: mandatory parameter containing the name of the scheduled task
  • TaskDescr: mandatory parameter containing the description of the scheduled task
  • ManagementServer: mandatory parameter containing management server name
  • GroupName: mandatory parameter containing display name of the target group
  • DurationTimeMin: mandatory parameter containing the duration of the maintenance task in minutes
  • Comment: mandatory parameter containing a comment for the maintenance task

Of course, I could define more parameters. Please, feel free to add more parameters to this script, according to your needs.
With PowerShell:

param(
[Parameter(Mandatory=$true)][string]
$TaskName,
[Parameter(Mandatory=$true)][string]
$TaskDescr,
[Parameter(Mandatory=$true)][string]
$ManagementServer,
[Parameter(Mandatory=$true)][string]
$GroupName,
[Parameter(Mandatory=$true)][string]
$DurationTimeMin, 
[Parameter(Mandatory=$true)][string]
$Comment
)

Now, I create my new Com object:

# Attach the Task Scheduler com object
$service = new-object -ComObject("Schedule.Service")

I connect it to my local machine:

# connect to the local machine.
$service.Connect()
$rootFolder = $service.GetFolder("\")

I define a new task, enable it, add the description coming from a parameter, allow the task to be started on demand, and set its start time to now:

$TaskDefinition = $service.NewTask(0)
$TaskDefinition.RegistrationInfo.Description = "$TaskDescr"
$TaskDefinition.Settings.Enabled = $true
$TaskDefinition.Settings.AllowDemandStart = $true
# Time when the task starts
$TaskStartTime = [datetime]::Now

My task will execute the script to set my SCOM group in maintenance mode. I will specify the path to my script, use PowerShell to execute my command and give arguments to my script:

# Task Action command
$TaskCommand = "powershell.exe"
# PowerShell script to be executed
$TaskScript = "C:\powershell\GroupMaintenanceMode.ps1"
# The Task Action command argument
$TaskArg = '-ExecutionPolicy Bypass "c:\powershell\GroupMaintenanceMode.ps1" -ManagementServer "' + $ManagementServer + '" -GroupName "''' + $GroupName + '''" -DurationTimeMin ' + $DurationTimeMin + ' -Reason "PlannedOther" -Comment "''' + $Comment + '''"'

At this step my task is created, but it still needs a schedule. You can find all the information about how to schedule a task on MSDN here. In my context, I will use Create(5) to trigger the task every fourth Thursday of each month:

$triggers = $TaskDefinition.Triggers
$trigger = $triggers.Create(5)      # 5 = monthly day-of-week trigger
$trigger.DaysOfWeek = '16'          # 16 = Thursday (Sunday=1, Monday=2, Tuesday=4, Wednesday=8...)
$trigger.MonthsOfYear = '4095'      # all months: Jan=1, Feb=2, Mar=4, Apr=8... (sum = 4095)
$trigger.WeeksOfMonth = '8'         # 8 = fourth week of the month
$trigger.StartBoundary = $TaskStartTime.ToString("yyyy-MM-dd'T'HH:mm:ss")

To trigger my task once at a specific time of day, I would use instead:

$trigger = $triggers.Create(1)      # 1 = one-time trigger
$trigger.StartBoundary = $TaskStartTime.ToString("yyyy-MM-dd'T'HH:mm:ss")

To trigger my task on a monthly schedule, every first day of the month, I can use:

$trigger = $triggers.Create(4)      # 4 = monthly trigger
$trigger.DaysOfMonth = 1            # bitmask: 1 = first day of the month
$trigger.StartBoundary = $TaskStartTime.ToString("yyyy-MM-dd'T'HH:mm:ss")

Any schedule is possible and just has to be tried out.
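For instance, a weekly trigger firing every Monday would look like this (a sketch, not part of the original script):

$trigger = $triggers.Create(3)      # 3 = weekly trigger
$trigger.DaysOfWeek = 2             # bitmask: Sunday=1, Monday=2, Tuesday=4...
$trigger.WeeksInterval = 1          # run every week
$trigger.StartBoundary = $TaskStartTime.ToString("yyyy-MM-dd'T'HH:mm:ss")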
Next, I create the action part of my scheduled task and assign my command and its argument:

$Action = $TaskDefinition.Actions.Create(0)   # 0 = executable action
$Action.Path = "$TaskCommand"
$Action.Arguments = "$TaskArg"

It’s time to create my task. There are multiple ways to register the task; you can have a look here for more details.
I will create or update my task depending on whether it already exists. For information, I use the domain administrator credentials for the execution.

# 6 = TASK_CREATE_OR_UPDATE: create the task, or update it if it already exists (last argument 1 = password logon)
$rootFolder.RegisterTaskDefinition("$TaskName",$TaskDefinition,6,"ADSTS\Administrator","*********",1)

I open a PowerShell window to execute my script with the following command:

.\ScheduleTask.ps1 -TaskName "Schedule Maintenance 1" -TaskDescr "Task schedule with PowerShell" -ManagementServer 'SCOM2012R2.adsts.local' -GroupName "SQL Server Production Instances" -DurationTimeMin "5" -Comment "Maintenance for Patching"

[Screenshot: script execution]
My scheduled task has been created with the action and trigger I mentioned earlier:

[Screenshots: the scheduled task, its action, and its triggers]

I hope this blog post will help you manage your maintenance periods for SCOM groups.
Happy scripting 😉

 


Set the SQL Native Client Default Port with PowerShell


I wrote an article about “SQL Server 2012: Configuring your TCP Port via PowerShell” and I received a question from PaulJ:
“How do you set the port for the SQL Native Client 11.0 Configuration (32bit) – as seen in the SQL Configuration Manager?”

This is a very good question, and I decided to write this blog post as an answer.
The first step is always the same: the initialization of my object:

[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | Out-Null

$wmi = New-Object ("Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer")

The second step is to find out which client protocol the setting belongs to.
In the class “Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer”, you find a property “ClientProtocols”, as you can see on the MSDN web page:
[Screenshot: the ClientProtocols property on MSDN]
I display the name and the protocol properties with this command:

$wmi.ClientProtocols | Select displayname -ExpandProperty ProtocolProperties

[Screenshot: client protocols and their properties]
As you can see, I get the client protocols and their properties (Named Pipes, Default Port, KEEPALIVE and KEEPALIVEINTERVAL).
The next step is to select the default port:

$tcp_list = $wmi.ClientProtocols  | Where-Object {$_.displayname -eq "TCP/IP"}
$default_tcp = $tcp_list.ProtocolProperties | Where-Object {$_.Name -eq "Default Port"}
$default_tcp

[Screenshot: the Default Port property]
As you can see, the default client port is set to 1433. Now I will set another value for this port:

$default_tcp.value=50100

Note: The port has a System.Int32 type
Validate this change with an Alter:

$tcp_list.alter()

To finish, do not forget to restart your services to activate the port change:

$sql_service = ($wmi.Services | Where-Object { $_.Type -eq "SqlServer" })
$sql_service.alter()
$sql_service.stop()
$sql_service.start()

[Screenshot: restarting the SQL Server service]
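To double-check, you can re-read the property with the objects created above (a small verification snippet, assuming the same $wmi object):

# Re-read the Default Port to verify the change
$tcp = $wmi.ClientProtocols | Where-Object { $_.DisplayName -eq "TCP/IP" }
($tcp.ProtocolProperties | Where-Object { $_.Name -eq "Default Port" }).Value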
Et voilà! The default port for the client protocol has been changed!


SQL Server 2016: New SQL PowerShell CMDLETs for ErrorLog


With the latest release of SQL Server Management Studio (SSMS) 2016 (13.0.15500.91), downloadable here, new cmdlets were introduced for Always Encrypted, SQL Agent and the ErrorLog.
[Screenshot: SSMS update]

In this article, I will present the two new cmdlets for the Error Logs:

  • Get-SqlErrorLog: Retrieves the SQL Server Logs.
  • Set-SqlErrorLog: Sets or resets the maximum number of error log files before recycling.

My first step is to search all commands containing “Sql”:

Get-Command | Select Name | Where-Object {$_.Name -like "*Sql*"}

[Screenshot: commands matching *Sql*]

As you can see, I get a lot of commands. I filter on SqlErrorLog to get the details of both commands:

Get-Command | Where-Object {$_.Name -like "*SqlErrorLog*"} | Format-List *

[Screenshot: details of the SqlErrorLog cmdlets]

To get the details per command, I use these commands:

Get-Command | Where-Object {$_.Name -eq "Get-SqlErrorLog"} | Format-List *
Get-Command | Where-Object {$_.Name -eq "Set-SqlErrorLog"} | Format-List *

 

CMDLET Get-SqlErrorLog

For example, here is a simple query to retrieve all backup lines:

Get-SqlErrorLog | Where-Object { $_.text -like "*BACKUP*"} | Out-GridView

[Screenshot: backup entries returned by Get-SqlErrorLog]

You can do the same for failed logins:

Get-SqlErrorLog | Where-Object { $_.text -like "*Failed*"} | Out-GridView

[Screenshot: failed login entries]

Or directly find all errors between 2 dates with –Before and –After parameters:

Get-SqlErrorLog -Before "2016/06/30" -After "2016/06/28" | Where-Object { $_.text -like "*Error:*"} | Out-GridView

[Screenshot: errors between the two dates]

CMDLET Set-SqlErrorLog

It is very easy to configure the number of error log files with this command:

Set-SqlErrorLog -MaxLogCount [6-99]

[Screenshot: setting the maximum log count]

After the command:
[Screenshot: the new maximum number of error log files]

For fun, I tried entering a value of 1 and a value of 100 to see whether an error message appears:
[Screenshot: error messages for out-of-range values]

It is very nice to have these two new cmdlets in SQL PowerShell 😉


Generate Azure VM with Resource Manager deployment in PowerShell


Recently, a new way to manage the Azure infrastructure appeared: Resource Manager. It brings many advantages compared to the classic deployment model.
The differences between these two deployment models will not be covered in this blog post, because that is not its goal, and a very good Microsoft article already exists on this subject.

In this blog, we will generate a new Windows Azure Virtual Machine using Resource Manager deployment with PowerShell from On-Premise.

Remember, only RM objects can be listed with RM cmdlets! Conversely, only Classic objects can be listed with Classic cmdlets!

We can connect automatically to Azure Account with this command:
Select-AzureRmProfile -Path "C:\temp\AzureCert.json"

But to download this certificate, we need to connect manually to Azure Account at least once as follows:
Add-AzureRmAccount -SubscriptionId "<YourSubscriptionID>"

Enter your personal credentials and then run the following command:
Save-AzureRmProfile -Path "C:\temp\AzureCert.json"

If you want to navigate through your different attached Azure subscriptions, use the cmdlets Get-AzureRmSubscription/Select-AzureRmSubscription.
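For example (the subscription ID is a placeholder):

Get-AzureRmSubscription
Select-AzureRmSubscription -SubscriptionId "<YourSubscriptionID>"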

To obtain the different existing Azure Locations:
Get-AzureRmLocation | Select DisplayName

For the end of this blog, we will work in this specific Azure Location:
$location = "West Europe"

Hardware Profile

To list all different available Resource Group:
Get-AzureRmResourceGroup | Select ResourceGroupName, Location

And select your specific Azure Resource Group:
$resourceGroupName = (Get-AzureRmResourceGroup).ResourceGroupName[0]

To choose the correct VM size, list all available Azure formats:
Get-AzureRmVMSize -location $location | Select Name, NumberOfCores, MemoryInMB
$vmSize = "Standard_A3"

And initialize the VM object to build (note that a name for the future VM has to be defined too):
$vmName = "Artanis"
$vm = New-AzureRmVMConfig -Name $vmName -VMSize $vmSize

Image Profile

Now we want to select a specific image available from a publisher in Azure. In this case, we will choose the last SQL Server 2016 Enterprise edition ISO.
The different steps will describe the method to find out all the elements to select the correct available image.

Select all publishers from a specific Azure Location:
Get-AzureRmVMImagePublisher -Location $location | Select PublisherName
$publisher = "MicrosoftSQLServer"

Now select all offers from a specific Azure Publisher:
Get-AzureRmVMImageOffer -Location $location -PublisherName $publisher | Select Offer
$offer = "SQL2016-WS2012R2"

Then select all Skus from a specific Azure Offer:
Get-AzureRmVMImageSku -Location $location -PublisherName $publisher -Offer $offer | Select Skus
$skus = "Enterprise"

Finally choose your version:
(Get-AzureRmVMImage -Location $location -PublisherName $publisher -Offer $offer -Skus $skus).version

To obtain the last version of the image:
$Version = (Get-AzureRmVMImage -Location $location -PublisherName $publisher -Offer $offer -Skus $skus | Sort-Object -Property Version -Descending).Version[0]

Add the image profile to the existing VM object:
$vm = Set-AzureRmVMSourceImage -VM $vm -PublisherName $publisher -Offer $offer -Skus $skus -Version $version

OS Profile

According to the Image Profile, the Virtual Machine will be a Windows Server. So enter the specifications as follows:
$username = "dbi"
$password = ConvertTo-SecureString 'B3stPa$$w0rd3v3r' -AsPlainText -Force   # single quotes so that $$ is not expanded
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
$vm = Set-AzureRmVMOperatingSystem -VM $VM -ComputerName "Artanis" -Windows -Credential $cred -ProvisionVMAgent

Disk Profile

As the VM will be created from an Azure Image, we need to specify a location and a name for the OS disk.

To list all your available Azure Storage Accounts, run this command:
Get-AzureRmStorageAccount | Select StorageAccountName, Location

To list the different containers available in your Azure Storage:
(Get-AzureRmStorageAccount | Get-AzureStorageContainer).CloudBlobContainer

And now add a disk profile to the existing VM:
$diskLocation = "https://<accountStorageName>.blob.core.windows.net/vhds/"
$vm = Set-AzureRmVMOSDisk -VM $vm -Name "artanisVHDOS.vhd" -VhdUri ($diskLocation+"artanisVHDOS.vhd") -CreateOption FromImage

IP Profile

Here is an example of Network configuration:
$subnet = New-AzureRmVirtualNetworkSubnetConfig -Name "CloudSubnet" -AddressPrefix "10.0.64.0/24"
$ain = New-AzureRmVirtualNetwork -Name "VirtualNetwork" -ResourceGroupName $resourceGroupName -Location $location -AddressPrefix "10.0.0.0/16" -Subnet $subnet
$pip = New-AzureRmPublicIpAddress -Name "AzurePublicIP" -ResourceGroupName $resourceGroupName -AllocationMethod Dynamic -Location $location
$nic = New-AzureRmNetworkInterface -Name "AzureNetInterface" -ResourceGroupName $resourceGroupName -Location $location -SubnetId $ain.Subnets[0].Id -PublicIpAddressId $pip.Id
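One step the listing above leaves implicit: the network interface still has to be attached to the VM object before deployment. A minimal sketch, assuming the $vm and $nic variables from above:

# Attach the network interface to the VM configuration
$vm = Add-AzureRmVMNetworkInterface -VM $vm -Id $nic.Id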

Conclusion: VM generation

Now we have entered all different profiles required to generate a new Windows Azure VM:
$azurevm = New-AzureRmVM -ResourceGroupName $resourceGroupName -Location $location -VM $vm

Use “Get-AzureRmVM” cmdlet to list all available VMs.

To download the remote desktop file to connect to this new virtual machine, use the following command:
Get-AzureRmRemoteDesktopFile -ResourceGroupName $resourceGroupName -Name $vmName -LocalPath "C:\Temp\Artanis.rdp"

With all these commands, you can see how simple it is to automate the generation of a new virtual machine in Azure. Moreover, you have probably noticed that the construction of the VM object (with the different profiles) is similar to the Hyper-V structure.

I hope it helps you 😉


Manage Azure in PowerShell (RM)


Azure offers two deployment models for cloud components: Resource Manager (RM) and the Classic deployment model. Newer and easier to manage, Resource Manager is the model Microsoft recommends.
Even if these two models can coexist in Azure, they are different and managed differently: in PowerShell, each model has its own specific cmdlets.

In order to be able to communicate with Azure from On-Premises in PowerShell, you need to download and install the Azure PowerShell from WebPI. For more details, please refer to this Microsoft Azure post “How to install and configure Azure PowerShell“.
 
 
Azure PowerShell installs many modules located in C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell:
Get-Module -ListAvailable -Name *AzureRm*
ModuleType Version Name ExportedCommands
---------- ------- ---- ----------------
Manifest 1.1.3 AzureRM.ApiManagement {Add-AzureRmApiManagementRegion, Get-AzureRmApiManagementSsoToken, New-AzureRmApiManagementHostnam...
Manifest 1.0.11 AzureRM.Automation {Get-AzureRMAutomationHybridWorkerGroup, Get-AzureRmAutomationJobOutputRecord, Import-AzureRmAutom...
Binary 0.9.8 AzureRM.AzureStackAdmin {Get-AzureRMManagedLocation, New-AzureRMManagedLocation, Remove-AzureRMManagedLocation, Set-AzureR...
Manifest 0.9.9 AzureRM.AzureStackStorage {Add-ACSFarm, Get-ACSEvent, Get-ACSEventQuery, Get-ACSFarm...}
Manifest 1.0.11 AzureRM.Backup {Backup-AzureRmBackupItem, Enable-AzureRmBackupContainerReregistration, Get-AzureRmBackupContainer...
Manifest 1.1.3 AzureRM.Batch {Remove-AzureRmBatchAccount, Get-AzureRmBatchAccount, Get-AzureRmBatchAccountKeys, New-AzureRmBatc...
Manifest 1.0.5 AzureRM.Cdn {Get-AzureRmCdnCustomDomain, New-AzureRmCdnCustomDomain, Remove-AzureRmCdnCustomDomain, Get-AzureR...
Manifest 0.1.2 AzureRM.CognitiveServices {Get-AzureRmCognitiveServicesAccount, Get-AzureRmCognitiveServicesAccountKey, Get-AzureRmCognitive...
Manifest 1.3.3 AzureRM.Compute {Remove-AzureRmAvailabilitySet, Get-AzureRmAvailabilitySet, New-AzureRmAvailabilitySet, Get-AzureR...
Manifest 1.0.11 AzureRM.DataFactories {Remove-AzureRmDataFactory, Get-AzureRmDataFactoryRun, Get-AzureRmDataFactorySlice, Save-AzureRmDa...
Manifest 1.1.3 AzureRM.DataLakeAnalytics {Get-AzureRmDataLakeAnalyticsDataSource, Remove-AzureRmDataLakeAnalyticsCatalogSecret, Set-AzureRm...
Manifest 1.0.11 AzureRM.DataLakeStore {Add-AzureRmDataLakeStoreItemContent, Export-AzureRmDataLakeStoreItem, Get-AzureRmDataLakeStoreChi...
Manifest 1.0.2 AzureRM.DevTestLabs {Get-AzureRmDtlAllowedVMSizesPolicy, Get-AzureRmDtlAutoShutdownPolicy, Get-AzureRmDtlAutoStartPoli...
Manifest 1.0.11 AzureRM.Dns {Get-AzureRmDnsRecordSet, New-AzureRmDnsRecordConfig, Remove-AzureRmDnsRecordSet, Set-AzureRmDnsRe...
Manifest 1.1.3 AzureRM.HDInsight {Get-AzureRmHDInsightJob, New-AzureRmHDInsightSqoopJobDefinition, Wait-AzureRmHDInsightJob, New-Az...
Manifest 1.0.11 AzureRM.Insights {Add-AzureRmMetricAlertRule, Add-AzureRmLogAlertRule, Add-AzureRmWebtestAlertRule, Get-AzureRmAler...
Manifest 1.1.10 AzureRM.KeyVault {Get-AzureRmKeyVault, New-AzureRmKeyVault, Remove-AzureRmKeyVault, Remove-AzureRmKeyVaultAccessPol...
Manifest 1.0.7 AzureRM.LogicApp {Get-AzureRmIntegrationAccountAgreement, Get-AzureRmIntegrationAccountCallbackUrl, Get-AzureRmInte...
Manifest 0.9.2 AzureRM.MachineLearning {Export-AzureRmMlWebService, Get-AzureRmMlWebServiceKeys, Import-AzureRmMlWebService, Remove-Azure...
Manifest 1.0.12 AzureRM.Network {Add-AzureRmApplicationGatewayBackendAddressPool, Get-AzureRmApplicationGatewayBackendAddressPool,...
Manifest 1.0.11 AzureRM.NotificationHubs {Get-AzureRmNotificationHubsNamespaceAuthorizationRules, Get-AzureRmNotificationHubsNamespaceListK...
Manifest 1.0.11 AzureRM.OperationalInsights {Get-AzureRmOperationalInsightsSavedSearch, Get-AzureRmOperationalInsightsSavedSearchResults, Get-...
Manifest 1.0.0 AzureRM.PowerBIEmbedded {Remove-AzureRmPowerBIWorkspaceCollection, Get-AzureRmPowerBIWorkspaceCollection, Get-AzureRmPower...
Manifest 1.0.11 AzureRM.Profile {Enable-AzureRmDataCollection, Disable-AzureRmDataCollection, Remove-AzureRmEnvironment, Get-Azure...
Manifest 1.1.3 AzureRM.RecoveryServices {Get-AzureRmRecoveryServicesBackupProperties, Get-AzureRmRecoveryServicesVault, Get-AzureRmRecover...
Manifest 1.0.3 AzureRM.RecoveryServices.Backup {Backup-AzureRmRecoveryServicesBackupItem, Get-AzureRmRecoveryServicesBackupManagementServer, Get-...
Manifest 1.1.9 AzureRM.RedisCache {Reset-AzureRmRedisCache, Export-AzureRmRedisCache, Import-AzureRmRedisCache, Remove-AzureRmRedisC...
Manifest 2.0.2 AzureRM.Resources {Get-AzureRmADApplication, Get-AzureRmADGroupMember, Get-AzureRmADGroup, Get-AzureRmADServicePrinc...
Manifest 1.0.2 AzureRM.ServerManagement {Install-AzureRmServerManagementGatewayProfile, Reset-AzureRmServerManagementGatewayProfile, Save-...
Manifest 1.1.10 AzureRM.SiteRecovery {Stop-AzureRmSiteRecoveryJob, Get-AzureRmSiteRecoveryNetwork, Get-AzureRmSiteRecoveryNetworkMappin...
Manifest 1.0.11 AzureRM.Sql {Get-AzureRmSqlDatabaseImportExportStatus, New-AzureRmSqlDatabaseExport, New-AzureRmSqlDatabaseImp...
Manifest 1.1.3 AzureRM.Storage {Get-AzureRmStorageAccount, Get-AzureRmStorageAccountKey, Get-AzureRmStorageAccountNameAvailabilit...
Manifest 1.0.11 AzureRM.StreamAnalytics {Get-AzureRmStreamAnalyticsFunction, Get-AzureRmStreamAnalyticsDefaultFunctionDefinition, New-Azur...
Manifest 1.0.11 AzureRM.Tags {Remove-AzureRmTag, Get-AzureRmTag, New-AzureRmTag}
Manifest 1.0.11 AzureRM.TrafficManager {Disable-AzureRmTrafficManagerEndpoint, Enable-AzureRmTrafficManagerEndpoint, Set-AzureRmTrafficMa...
Manifest 1.0.11 AzureRM.UsageAggregates Get-UsageAggregates
Manifest 1.1.3 AzureRM.Websites {Get-AzureRmAppServicePlanMetrics, New-AzureRmWebAppDatabaseBackupSetting, Restore-AzureRmWebAppBa...

 
The basic cmdlets to connect and navigate between your different Accounts or Subscriptions are located in “AzureRM.Profile” module:
PS C:\> Get-Command -Module AzureRM.Profile
CommandType Name Version Source
----------- ---- ------- ------
Alias Login-AzureRmAccount 1.0.11 AzureRM.Profile
Alias Select-AzureRmSubscription 1.0.11 AzureRM.Profile
Cmdlet Add-AzureRmAccount 1.0.11 AzureRM.Profile
Cmdlet Add-AzureRmEnvironment 1.0.11 AzureRM.Profile
Cmdlet Disable-AzureRmDataCollection 1.0.11 AzureRM.Profile
Cmdlet Enable-AzureRmDataCollection 1.0.11 AzureRM.Profile
Cmdlet Get-AzureRmContext 1.0.11 AzureRM.Profile
Cmdlet Get-AzureRmEnvironment 1.0.11 AzureRM.Profile
Cmdlet Get-AzureRmSubscription 1.0.11 AzureRM.Profile
Cmdlet Get-AzureRmTenant 1.0.11 AzureRM.Profile
Cmdlet Remove-AzureRmEnvironment 1.0.11 AzureRM.Profile
Cmdlet Save-AzureRmProfile 1.0.11 AzureRM.Profile
Cmdlet Select-AzureRmProfile 1.0.11 AzureRM.Profile
Cmdlet Set-AzureRmContext 1.0.11 AzureRM.Profile
Cmdlet Set-AzureRmEnvironment 1.0.11 AzureRM.Profile

Using the cmdlets present in the “AzureRM.Profile” module, you will be able to connect to your Azure Account (enter your credentials):
PS C:\> Login-AzureRmAccount
Environment : AzureCloud
Account : n.courtine@xxxxxx.com
TenantId : a123456b-789b-123c-4de5-67890fg123h4
SubscriptionId : z123456y-789x-123w-4vu5-67890ts123r4
SubscriptionName : ** Subscription Name **
CurrentStorageAccount :

 
You can list your associated Azure Subscriptions:
Get-AzureRmSubscription
SubscriptionName : ** Subscription Name **
SubscriptionId : z123456y-789x-123w-4vu5-67890ts123r4
TenantId : a123456b-789b-123c-4de5-67890fg123h4

 
To switch your Subscription, do as follows:
Select-AzureRmSubscription -SubscriptionId z123456y-789x-123w-4vu5-67890ts123r4
Environment : AzureCloud
Account : n.courtine@xxxxxx.com
TenantId : a123456b-789b-123c-4de5-67890fg123h4
SubscriptionId : z123456y-789x-123w-4vu5-67890ts123r4
SubscriptionName : ** Subscription Name **
CurrentStorageAccount :

 
Or you can take a specific “snapshot” of your current location in Azure. It will help you easily return to the specific context that existed at the moment you ran the command:
PS C:\> $context = Get-AzureRmContext
Environment : AzureCloud
Account : n.courtine@xxxxxx.com
TenantId : a123456b-789b-123c-4de5-67890fg123h4
SubscriptionId : z123456y-789x-123w-4vu5-67890ts123r4
SubscriptionName : ** Subscription Name **
CurrentStorageAccount :
...
PS C:\> Set-AzureRmContext -Context $context
Environment : AzureCloud
Account : n.courtine@xxxxxx.com
TenantId : a123456b-789b-123c-4de5-67890fg123h4
SubscriptionId : z123456y-789x-123w-4vu5-67890ts123r4
SubscriptionName : ** Subscription Name **
CurrentStorageAccount :

 
It is also possible to list all the available Storage Accounts associated with your current subscription:
PS C:\> Get-AzureRmStorageAccount | Select StorageAccountName, Location
StorageAccountName Location
------------------ --------
semicroustillants259 westeurope
semicroustillants4007 westeurope
semicroustillants8802 westeurope

 
To see the existing blob containers in each Storage Account:
PS C:\> Get-AzureRmStorageAccount | Get-AzureStorageContainer
Blob End Point: https://dbimssql.blob.core.windows.net/
Name Uri LastModified
---- --- ------------
bootdiagnostics-t... https://dbimssql.blob.core.windows.net/bootdiagnostics-ta... 30.09.2016 12:36:12 +00:00
demo https://dbimssql.blob.core.windows.net/demo 05.10.2016 14:16:01 +00:00
vhds https://dbimssql.blob.core.windows.net/vhds 30.09.2016 12:36:12 +00:00
Blob End Point: https://semicroustillants259.blob.core.windows.net/
Name Uri LastModified
---- --- ------------
mastervhds https://semicroustillants259.blob.core.windows.net/master... 28.09.2016 13:41:19 +00:00
uploads https://semicroustillants259.blob.core.windows.net/uploads 28.09.2016 13:41:19 +00:00
vhds https://semicroustillants259.blob.core.windows.net/vhds 28.09.2016 13:55:57 +00:00
Blob End Point: https://semicroustillants4007.blob.core.windows.net/
Name Uri LastModified
---- --- ------------
artifacts https://semicroustillants4007.blob.core.windows.net/artif... 28.09.2016 13:59:47 +00:00

Azure infrastructure can easily be managed from on-premises with PowerShell. In a previous post, I explained how to deploy a virtual machine from an image in Azure PowerShell.
If you have remarks or advice, do not hesitate to share 😉


Live from SQL Saturday Slovenia 2016!



After a little trip, just a one-hour flight from Zürich to Ljubljana yesterday, SQL Saturday Slovenia 2016 began this morning at the Faculty of Computer and Information Science of the University of Ljubljana.


I had to wake up very early because my session was the first of the day, at 9:00 AM.


I was also very happy to meet and share my expertise with Slovenian and other SQL Server experts.

My session was about the famous ErrorLog.

As a DBA, the Error log is an essential daily tool in our life.

Learning and understanding its content is not the whole job; we also have to manage it to get a better interpretation.

This session answers questions that you perhaps never asked yourself:

  • What are ErrorLog files?
  • Where are ErrorLog files?
  • Need I manage the ErrorLog?
  • How to read and understand?
  • How to write in the ErrorLog?


I hope that people learned something in this session!

You can download the presentation here.

I want to thank all the organizers of this very nice event, with special thanks to Vedran for these photos!

Now, I will also go to see the other sessions. 😉

I give you « rendez-vous » at the IT-Tage on Tuesday for this session in German!



Pass Summit 2017


Today starts the PASS Summit 2017, taking place in Seattle.
After a long flight over the ocean yesterday (more than 10 hours) and a nice jet lag which kept me from sleeping past 4 AM this morning, I arrived at the Convention Center in Seattle where the PASS Summit takes place.


I started this first day with the session of Itzik Ben-Gan: T-SQL Tips and Tricks.
As part of the session, Itzik spoke about batch mode processing (available since SQL Server 2012), which boosts the execution of T-SQL scripts compared to row execution mode.
The problem is that batch mode is only available with columnstore indexes, so if you don't have a columnstore index on your table you cannot benefit from this feature.
To work around this drawback, Itzik showed us the possibility of creating a filtered columnstore index (filtered columnstore indexes are available since 2016) which returns no rows but enables batch mode processing.
Well done!

After a quick lunch, I continued this first day with the session of Drew Furgiuele about PowerShell.

After having explained why to use PowerShell (automation, a bridge between tools…) and how to install the SQLSERVER module (Install-Module SQLServer or Save-Module SQLServer), Drew showed how to use this module.
The first interesting point is how to browse SQL Server once the module has been installed.
For that just execute the PS script:

cd SQLSERVER:\

After connecting to your SQL Server instance with cd sql\<servername>\default for a SQL Server default instance, or \<instancename> for a named instance, it is possible to browse your complete instance just as you can via SQL Server Management Studio, with commands like:

$dbs = Get-ChildItem
$dbs = Get-ChildItem | Where-Object {$_.Name -eq "AdventureWorks2014"}
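Putting both steps together, a quick browse might look like this (server and instance names are hypothetical):

# Navigate the provider and list databases with their recovery model
cd SQLSERVER:\sql\MYSERVER\DEFAULT\Databases
Get-ChildItem | Select-Object Name, RecoveryModel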

Easy, for a first step with PowerShell.
Of course, Drew showed us much more, with PowerShell scripts copying tables from one instance to another, managing backups identically across your whole environment, or executing a point-in-time restore.
Well done Drew.

The last session of the day had two parts and was given by Glenn Berry, about migration to SQL Server 2017.
Glenn explained that there are plenty of reasons to upgrade to SQL Server 2017: great new features, features now available in Standard Edition (starting with 2016 SP1)…
But he also pointed out that there are big performance differences between Standard and Enterprise Edition, with examples using columnstore indexes or running DBCC CHECKDB.
So it is not only new features that are available with Enterprise Edition; it can also provide great performance gains, which is often forgotten.
There are also limits on memory, sockets and physical core usage with Standard Edition, so don't build a virtual machine for a Standard Edition with too much memory or too many sockets/cores, because it will not be able to use them 😉 You can learn more on Glenn Berry's blog.

This first day was great, with lots of interesting sessions.
It's now time to visit Seattle a little and wait for tomorrow's second day, with more great sessions and speakers!

 


How To Deploy Office Web Apps Server 2013


The 4 Steps Of Office Web Apps Server 2013 Installation

Office Web Apps provides browser-based versions of Excel, OneNote, Word and PowerPoint. It also helps users who access files through SharePoint 2013.

The objective of this topic is to describe the steps to install Office Web Apps 2013, create the farm, and set up the binding so that it can be used within a SharePoint 2013 test environment.

For this example, we have the following systems in place:

  • Windows Server 2012 r2
  • SharePoint Server 2013

1) Install Server roles, features & Role services

Server roles:

  • Web server

Features:

  • Ink and Handwriting services

Role services:

  • Dynamic Content Compression
  • Windows Authentication
  • .Net Extensibility 4.5
  • ASP.Net 4.5
  • Server Side Includes

Restart the server.

Note that if your installation is done on Windows Server 2016, the “Ink and Handwriting Services” feature is now a default part of the server and no longer requires a separate package.

2) Install Office Web Apps

Launch the setup from the DVD file and wait until the installation is finished.

3) Create Office Web Apps Farm

1) Specify the internal URL for the server name
2) Use administrative privileges
3) Run the PowerShell command “New-OfficeWebAppsFarm -InternalURL http://servername -AllowHttp -EditingEnabled”

This command allows HTTP, as the farm is internal, and the -EditingEnabled option allows users to edit documents.

To verify that the farm was successfully created, type the URL “http://servername/hosting/discovery” in the browser.
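You can also check the farm from the same console; Get-OfficeWebAppsFarm ships with the module installed above:

# Display the current Office Web Apps farm configuration
Get-OfficeWebAppsFarm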

4) Bind Office Web Apps and SharePoint

The communication between both sides still needs to be done through the HTTP protocol.

1) Use administrative privileges
2) Switch to the SharePoint Management Shell
3) Run the command “New-SPWOPIBinding -ServerName servername -AllowHTTP”

The command should confirm that the HTTP protocol is used internally and return a list of bindings.

Check SharePoint default internal zone:

Get-SPWOPIZone

If it is HTTPS, change it to HTTP:

Set-SPWOPIZone -Zone internal-http

Allow OAuth over HTTP by setting the corresponding property to true:

  • $config = (Get-SPSecurityTokenServiceConfig)
  • $config.AllowOAuthOverHttp = $true
  • $config.update()

SharePoint can now use Office Web Apps.

To avoid errors, a few points need to be verified before testing Office Web Apps within SharePoint:

a) Check SharePoint authentication mode (claims-based and not classic) using PowerShell:

  • $WebAppURL = "http://webapp/"
  • (Get-SPWebApplication $WebAppURL).UseClaimsAuthentication

b) Check that the login account is not a system account but a test account.

c) Enable editing in Office Web Apps; if it is false, set it to true using the PowerShell command:

  • Set-OfficeWebAppsFarm -EditingEnabled:$true

d) Check that Office Web Apps has enough memory

If you need help, more details can be found here.


SQL Server – Collecting last backup information in an AlwaysOn environment


Introduction

Sometimes you face interesting challenges with unusual environments. One of my customers needed an automated and flexible backup solution. Said like that, nothing very complex, you will say. But if I mention that some databases were 60 TB big, with more than 30 filegroups and around 600 database data files each, and moreover synchronized in an AlwaysOn availability group, it is not the same story anymore, and you can easily imagine that a standard backup strategy would not be viable. Therefore I worked on implementing a solution using partial full, partial differential and read-only filegroup backups to minimize the time needed.
Well, this post does not explain the whole solution, but only a way to collect the last backup information of my databases, especially for the ones in an AlwaysOn availability group whose filegroup states changed.
If you have already worked with partial backups and read-only filegroup backups, you know that the backup sequence is very important; if you don't, you will quickly notice it when you need to restore, and you can easily understand why this last backup information is crucial. As the backups always have to run on the primary replica, you have to collect the information on all replicas in case a failover occurred and the primary changed, to ensure that you execute the right backups at the right moment and do not make unnecessary backups (remember the data volumes).

 

Explanation of the solution and code

Another thing to mention: because of security policies, it was forbidden to use linked servers, but fortunately xp_CmdShell was possible. I wanted each replica to work independently and needed a way to query the remote replicas to collect the last backup information on each SQL Server instance involved. Because the backup history might be cleaned, I need to store this information in local tables. I created 2 tables, one to store the last database backup information and the other to store the last read-only filegroup backup information. Additionally, I created 2 tables to collect temporarily the information coming from all replicas.

Creation of the last backup information tables:

--########################################################
--###Backup generator - backup last date info temporary table
--########################################################

USE [<YourDatabaseName>]
GO
/*
if OBJECT_ID('[dbo].[bakgen_backuplastdt_databases_temp]') is not null
	drop table [dbo].[bakgen_backuplastdt_databases_temp]
*/
create table [dbo].[bakgen_backuplastdt_databases_temp] (
	ServerName sysname not null,
	SqlInstanceName sysname  not null,
	SqlServerName sysname  not null,
	ServiceBrokerGuid uniqueidentifier not null,
	DatabaseCreationDate datetime  not null,
	DatabaseName sysname  not null,
	BackupType char(1) not null,
	LastBackupDate datetime  not null,
	LastBackupSize numeric(20,0) not null,
	is_primary bit null,
	insertdate datetime  not null
)
GO
create unique clustered index idx_bakgen_backuplastdt_databases_temp on [dbo].[bakgen_backuplastdt_databases_temp](DatabaseCreationDate,DatabaseName,BackupType,ServerName,SqlInstanceName)



--########################################################
--###Backup generator - backup last date info
--########################################################

USE [<YourDatabaseName>]
GO
/*
if OBJECT_ID('[dbo].[bakgen_backuplastdt_databases]') is not null
	drop table [dbo].[bakgen_backuplastdt_databases]
*/
create table [dbo].[bakgen_backuplastdt_databases] (
	ServerName sysname  not null,
	SqlInstanceName sysname  not null,
	SqlServerName sysname  not null,
	ServiceBrokerGuid uniqueidentifier not null,
	DatabaseCreationDate datetime  not null,
	DatabaseName sysname  not null,
	BackupType char(1) not null,
	LastBackupDate datetime  not null,
	LastBackupSize numeric(20,0) not null,
	is_primary bit null,
	insertdate datetime  not null
)
GO
create unique clustered index idx_bakgen_backuplastdt_databases on [dbo].[bakgen_backuplastdt_databases](DatabaseCreationDate,DatabaseName,BackupType,ServerName,SqlInstanceName)

I finally decided to work with a stored procedure calling PowerShell scripts to remotely execute the queries on the replicas.
The stored procedure lists the existing replicas and collects the last database backup information, then the read-only filegroup backup information, creating 2 different queries to execute locally on the server, storing the data in the temp tables first. It creates similar queries, excluding the databases not involved in availability groups, and executes them on the remote replicas using xp_CmdShell running PowerShell scripts. The PowerShell scripts are dynamically created from the generated T-SQL queries. They use one function of the well-known dbatools module, so you will have to install it first.
You will notice that, for easier reading and debugging, the generated scripts are nicely formatted in the log. But before executing your PowerShell script through xp_CmdShell, you need to apply some string formatting, like the 2 lines I added to avoid execution failures:

set @PSCmd = replace(replace(@PSCmd, nchar(13), N''), nchar(10), N' ')
set @PSCmd = replace(@PSCmd, '>', N'^>')

Do not forget to escape some characters, otherwise the execution will fail; in my case, omitting to escape the ‘>’ sign raised an “Access is denied” message in the output of the xp_CmdShell execution.

After that, the code compares what has been collected in the temp tables with the final tables and updates the information if needed.
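The post does not show the exact remote call, but with dbatools the building blocks could look like this (a sketch; the instance names, database name and the reuse of the generated query as a PowerShell variable are hypothetical):

# Run the generated query on a remote replica and land the rows in the local temp table
Import-Module dbatools
$rows = Invoke-DbaQuery -SqlInstance 'REMOTESRV\INST1' -Database msdb -Query $sqlRemoteDB
Write-DbaDataTable -SqlInstance 'LOCALSRV\INST1' -Database YourDatabaseName -Table dbo.bakgen_backuplastdt_databases_temp -InputObject $rows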

Here is the complete code of the stored procedure:

use [<YourDatabaseName>]
if OBJECT_ID('dbo.bakgen_p_getbakinfo') is not null
            drop procedure dbo.bakgen_p_getbakinfo 
go

CREATE PROCEDURE dbo.bakgen_p_getbakinfo 
AS
/************************************
*   dbi-services SA, Switzerland    *
*   http://www.dbi-services.com        *
*************************************
    Group/Privileges..: DBA
    Script Name......:       bakgen_p_getbakinfo.sql
    Author...........:          Christophe Cosme
    Date.............:           2019-09-20
    Version..........:          SQL Server 2016 / 2017
    Description......:        Get the backup information locally but also on the replicas involved

    Input parameters.: 

            Output parameter: 
                                               
    Called by........:         Stored Procedure : [dbo].[bakgen_p_bakexe]
************************************************************************************************
    Historical
    Date        Version    Who    Whats                  Comments
    ----------  -------    ---    --------    -----------------------------------------------------
    2019-09-30  1.0        CHC    Creation
************************************************************************************************/ 
BEGIN 

BEGIN TRY
            
            set nocount on

            declare 
    @ErrorMessage  NVARCHAR(4000), 
    @ErrorSeverity INT, 
    @ErrorState    INT;

            declare @ModuleName sysname,
                                    @ProcName sysname,
                                    @InfoLog nvarchar(max),
                                    @Execute char(1)
                        
            set @ModuleName = 'BakGen'
            set @ProcName = OBJECT_NAME(@@PROCID)
            set @Execute = 'A'

            set @InfoLog = 'Retrieve backup information'
            execute dbo.bakgen_p_log       
                        @ModuleName = @ModuleName,
                        @ProcedureName = @ProcName,
                        @ExecuteMode = @Execute,
                        @LogType = 'INFO',
                        @DatabaseName = null,
                        @Information = @InfoLog,
                        @Script = null


            --###variable to store error message
            declare @errmsg varchar(4000)
            --###variable with the current datetime
            declare @cdt datetime = getdate()

            --###variabler to store the sql and powershell commands to execute
            declare @sqllocalDB nvarchar(4000),
                                    @sqllocalFG nvarchar(4000),
                                    @sqlremoteDB nvarchar(4000),
                                    @sqlremoteFG nvarchar(4000),
                                    @PSCmd nvarchar(4000)

            --###variable to store the local SQL server name
            declare @LocalSqlServerName sysname
            --###variable to store the list of replicas
            declare @TAgReplica table (AgReplicaName sysname)
            --###variable for the cursors
            declare @AgReplicaName sysname

            --###set the local SQL Server name
            set @LocalSqlServerName = lower(convert(sysname,serverproperty('ServerName')))
                        

            --############################################################################
            --### check if tables exist
            --############################################################################
            if object_id('[dbo].[bakgen_backuplastdt_databases_temp]') is null
            begin
                        set @errmsg = 'Get Backup info : table not found'
                        set @errmsg += '          table name = [dbo].[bakgen_backuplastdt_databases_temp]' 
                        raiserror (@errmsg,11,1);
            end
            if object_id('[dbo].[bakgen_backuplastdt_fgreadonly_temp]') is null
            begin
                        set @errmsg = 'Get Backup info : table not found'
                        set @errmsg += '          table name = [dbo].[bakgen_backuplastdt_fgreadonly_temp]' 
                        raiserror (@errmsg,11,1);                      
            end

            if object_id('[dbo].[bakgen_backuplastdt_databases]') is null
            begin
                        set @errmsg = 'Get Backup info : table not found'
                        set @errmsg += '          table name = [dbo].[bakgen_backuplastdt_databases]' 
                        raiserror (@errmsg,11,1);
            end
            if object_id('[dbo].[bakgen_backuplastdt_fgreadonly]') is null
            begin
                        set @errmsg = 'Get Backup info : table not found'
                        set @errmsg += '          table name = [dbo].[bakgen_backuplastdt_fgreadonly]' 
                        raiserror (@errmsg,11,1);                      
            end


            
            --############################################################################
            --### select the replicas involved adding first the local server
            --############################################################################
            insert into @TAgReplica (AgReplicaName ) select @LocalSqlServerName

            --###check if alwayson feature is activated
            if (serverproperty('IsHadrEnabled') = 1)
            begin
                        insert into @TAgReplica (AgReplicaName )
                        select lower(agr.replica_server_name) from sys.availability_replicas agr
                                    where agr.replica_server_name <> @LocalSqlServerName
            end


            --############################################################################
            --### construct the SQL command to execute on the local SQL Server
            --############################################################################
            set @sqllocalDB = ''
            set @sqllocalDB +='

            declare @Tbi table (
                        ServerName sysname,
                        SqlInstanceName sysname,
                        SqlServerName sysname,
                        ServiceBrokerGuid uniqueidentifier,
                        DatabaseCreationDate datetime,
                        DatabaseName sysname,
                        BackupType char(1),
                        LastBackupDate datetime,
                        is_primary bit null,
                        insertdate datetime       
            )


            insert into @Tbi (
                        [ServerName],
                        [SqlInstanceName],
                        [SqlServerName],
                        [ServiceBrokerGuid],
                        [DatabaseCreationDate],
                        [DatabaseName],
                        [BackupType],
                        [LastBackupDate],
                        [is_primary],
                        [insertdate])
            select  
                        lower(convert(sysname,serverproperty(''machinename''))) as ServerName,
                        lower(convert(sysname,serverproperty(''InstanceName''))) as SqlInstanceName,
                        lower(convert(sysname,serverproperty(''ServerName''))) as SqlServerName,
                        db.service_broker_guid as ServiceBrokerGuid,
                        db.create_date as DatabaseCreationDate,
                        bs.database_name as DatabaseName,
                        bs.type as BackupType,
                        max(bs.backup_finish_date) as LastBackupDate,
                        sys.fn_hadr_is_primary_replica(bs.database_name) as is_primary,
                        ''' + convert(varchar,@cdt,120) + '''   
            from msdb.dbo.backupset bs
                        inner join sys.databases db on db.name = bs.database_name
                        where bs.type in (''D'',''I'',''P'',''Q'')
                                    and bs.is_copy_only = 0
                                    and coalesce(sys.fn_hadr_is_primary_replica(bs.database_name),-1) in (-1,0,1)
                        group by
                                    db.service_broker_guid,
                                    db.create_date,
                                    bs.database_name,
                                    bs.type, 
                                    sys.fn_hadr_is_primary_replica(bs.database_name)

            insert into [dbo].[bakgen_backuplastdt_databases_temp] (
                        [ServerName],
                        [SqlInstanceName],
                        [SqlServerName],
                        [ServiceBrokerGuid],
                        [DatabaseCreationDate],
                        [DatabaseName],
                        [BackupType],
                        [LastBackupDate],
                        [LastBackupSize],
                        [is_primary],
                        [insertdate])
            select  
                        t.[ServerName],
                        t.[SqlInstanceName],
                        t.[SqlServerName],
                        t.[ServiceBrokerGuid],
                        t.[DatabaseCreationDate],
                        t.[DatabaseName],
                        t.[BackupType],
                        t.[LastBackupDate],
                        bs.[backup_size],
                        t.[is_primary],
                        t.[insertdate]
            from @Tbi t
                        inner join msdb.dbo.backupset bs on 
                                    bs.backup_finish_date = t.LastBackupDate  
                                    and bs.database_name collate database_default = t.DatabaseName collate database_default
                                    and bs.type collate database_default = t.BackupType collate database_default
'




            set @sqllocalFG = ''
            set @sqllocalFG +='

            insert into [dbo].[bakgen_backuplastdt_fgreadonly_temp]
           ([ServerName],
           [SqlInstanceName],
           [SqlServerName],
                           [ServiceBrokerGuid],
                           [DatabaseCreationDate],
           [DatabaseName],
           [BackupType],
           [filegroup_name],
           [file_logicalname],
           [filegroup_guid],
           [file_guid],
           [LastBackupDate],
                           [LastBackupReadOnlyLsn],
           [is_primary],
                           [insertdate])
            select  
                        lower(convert(sysname,serverproperty(''machinename''))) as ServerName,
                        lower(convert(sysname,serverproperty(''InstanceName''))) as SqlInstanceName,
                        lower(convert(sysname,serverproperty(''ServerName''))) as SqlServerName,
                        db.service_broker_guid as ServiceBrokerGuid,
                        db.create_date as DatabaseCreationDate,
                        bs.database_name as DatabaseName,
                        bs.type as BackupType,
                        bf.filegroup_name,
                        bf.logical_name as file_logicalname,
                        bf.filegroup_guid,
                        bf.file_guid,
                        max(bs.backup_finish_date) as LastBackupDate,
                        max(bf.read_only_lsn) as LastBackupReadOnlyLsn,
                        sys.fn_hadr_is_primary_replica(bs.database_name) as is_primary, 
                        ''' + convert(varchar,@cdt,120) + '''   
            from msdb.dbo.backupset bs
                                    inner join msdb.dbo.backupfile bf on  bf.backup_set_id = bs.backup_set_id
                                    inner join sys.databases db on db.name = bs.database_name 
                        where 
                                    bs.backup_finish_date >= db.create_date 
                                    and bs.type in (''F'')
                                    and bs.is_copy_only = 0
                                    and coalesce(sys.fn_hadr_is_primary_replica(bs.database_name),-1) in (-1,0,1)
                                    and bf.is_present = 1
                                    and bf.is_readonly = 1
                                    and bf.file_type = ''D''
                        group by
                                    db.service_broker_guid,
                                    db.create_date,
                                    bs.database_name, 
                                    bs.type,
                                    bf.filegroup_name,
                                    bf.logical_name, 
                                    bf.filegroup_guid,
                                    bf.file_guid,
                                    sys.fn_hadr_is_primary_replica(bs.database_name)
'


            
            --############################################################################
            --### construct the SQL command to execute on the remote SQL Server
            --############################################################################
            set @sqlremoteDB = ''
            set @sqlremoteDB +='

            declare @Tbi table (
                        ServerName sysname,
                        SqlInstanceName sysname,
                        SqlServerName sysname,
                        ServiceBrokerGuid uniqueidentifier,
                        DatabaseCreationDate datetime, 
                        DatabaseName sysname,
                        BackupType char(1),
                        LastBackupDate datetime,
                        is_primary bit null,
                        insertdate datetime       
            )

            insert into @Tbi (
                        [ServerName],
                        [SqlInstanceName],
                        [SqlServerName],
                        [ServiceBrokerGuid],
                        [DatabaseCreationDate],
                        [DatabaseName],
                        [BackupType],
                        [LastBackupDate],
                        [is_primary],
                        [insertdate])
            select  
                        lower(convert(sysname,serverproperty(''machinename''))) as ServerName,
                        lower(convert(sysname,serverproperty(''InstanceName''))) as SqlInstanceName,
                        lower(convert(sysname,serverproperty(''ServerName''))) as SqlServerName,
                        db.service_broker_guid as ServiceBrokerGuid,
                        db.create_date as DatabaseCreationDate,
                        bs.database_name as DatabaseName,
                        bs.type as BackupType,
                        max(bs.backup_finish_date) as LastBackupDate,
                        sys.fn_hadr_is_primary_replica(bs.database_name) as is_primary, 
                        ''' + convert(varchar,@cdt,120) + '''     
            from msdb.dbo.backupset bs
                        inner join sys.databases db on db.name = bs.database_name 
                        where bs.type in (''D'',''I'',''P'',''Q'')
                                    and bs.is_copy_only = 0
                                    and coalesce(sys.fn_hadr_is_primary_replica(bs.database_name),-1) in (0,1)
                        group by
                                    db.service_broker_guid,
                                    db.create_date,
                                    bs.database_name,
                                    bs.type,
                                    sys.fn_hadr_is_primary_replica(bs.database_name) 

            select  
                        t.[ServerName],
                        t.[SqlInstanceName],
                        t.[SqlServerName],
                        t.[ServiceBrokerGuid],
                        t.[DatabaseCreationDate],
                        t.[DatabaseName],
                        t.[BackupType],
                        t.[LastBackupDate],
                        bs.[backup_size],
                        t.[is_primary],
                        t.[insertdate]
            from @Tbi t
                        inner join msdb.dbo.backupset bs on 
                                    bs.backup_finish_date = t.LastBackupDate 
                                    and bs.database_name collate database_default = t.DatabaseName collate database_default
                                    and bs.type collate database_default = t.BackupType collate database_default

'

            set @sqlremoteFG = ''
            set @sqlremoteFG +='

            select  
                        lower(convert(sysname,serverproperty(''machinename''))) as ServerName,
                        lower(convert(sysname,serverproperty(''InstanceName''))) as SqlInstanceName,
                        lower(convert(sysname,serverproperty(''ServerName''))) as SqlServerName,
                        db.service_broker_guid as ServiceBrokerGuid,
                        db.create_date as DatabaseCreationDate,
                        bs.database_name as DatabaseName,
                        bs.type as BackupType,
                        bf.filegroup_name,
                        bf.logical_name as file_logicalname,
                        bf.filegroup_guid,
                        bf.file_guid,
                        max(bs.backup_finish_date) as LastBackupDate,
                        max(bf.read_only_lsn) as LastReadOnlyLsn,
                        sys.fn_hadr_is_primary_replica(bs.database_name) as is_primary, 
                        ''' + convert(varchar,@cdt,120) + '''   
            from msdb.dbo.backupset bs
                                    inner join msdb.dbo.backupfile bf on  bf.backup_set_id = bs.backup_set_id
                                    inner join sys.databases db on db.name = bs.database_name 
                        where 
                                    bs.backup_finish_date >= db.create_date 
                                    and bs.type in (''F'')
                                    and bs.is_copy_only = 0
                                    and coalesce(sys.fn_hadr_is_primary_replica(bs.database_name),-1) in (0,1)
                                    and bf.is_present = 1
                                    and bf.is_readonly = 1
                                    and bf.file_type = ''D''
                        group by
                                    db.service_broker_guid,
                                    db.create_date, 
                                    bs.database_name, 
                                    bs.type,
                                    bf.filegroup_name,
                                    bf.logical_name, 
                                    bf.filegroup_guid,
                                    bf.file_guid,
                                    sys.fn_hadr_is_primary_replica(bs.database_name) 
'

            --############################################################################
            --### delete all records in the backup info tables
            --############################################################################
            delete from [dbo].[bakgen_backuplastdt_databases_temp]
            delete from [dbo].[bakgen_backuplastdt_fgreadonly_temp]

            --############################################################################
            --### loop for all replicas involved
            --############################################################################
            declare cur_replica cursor
            static local forward_only
            for 
                        select AgReplicaName
                        from @TAgReplica
                 
            open cur_replica
            fetch next from cur_replica into 
                        @AgReplicaName                    


            while @@fetch_status = 0
            begin 
                                    
                        if @LocalSqlServerName = @AgReplicaName
                        begin 

                                    set @InfoLog = 'Get database backup information on local SQL Server instance ' + QUOTENAME(@AgReplicaName)
                                    execute dbo.bakgen_p_log       
                                               @ModuleName = @ModuleName,
                                               @ProcedureName = @ProcName,
                                                @ExecuteMode = @Execute,
                                               @LogType = 'INFO',
                                               @DatabaseName = null,
                                               @Information = @InfoLog,
                                               @Script = @sqllocalDB
                                    execute sp_executesql @sqllocalDB

                                    set @InfoLog = 'Get read-only filegroup backup information on local SQL Server instance ' + QUOTENAME(@AgReplicaName)
                                    execute dbo.bakgen_p_log       
                                               @ModuleName = @ModuleName,
                                               @ProcedureName = @ProcName,
                                               @ExecuteMode = @Execute,
                                               @LogType = 'INFO',
                                               @DatabaseName = null,
                                               @Information = @InfoLog,
                                               @Script = @sqllocalFG
                                    execute sp_executesql @sqllocalFG

                        end 
                        else
                        begin
                                    --############################################################################
                                    --### construct the PowerShell command to execute on the remote SQL Server
                                    --############################################################################
                                    set @PSCmd  = ''
                                    set @PSCmd += 'PowerShell.exe '
                                    set @PSCmd += '-Command "'
                                    set @PSCmd += '$qrydb = \"' + @sqlremoteDB + '\"; ' 
                                    set @PSCmd += '$qryfg = \"' + @sqlremoteFG + '\"; ' 
                                    set @PSCmd += '$rdb = Invoke-DbaQuery -SqlInstance ' + @AgReplicaName + ' -Query $qrydb; '
                                    set @PSCmd += '$rfg = Invoke-DbaQuery -SqlInstance ' + @AgReplicaName + ' -Query $qryfg; '
                                    set @PSCmd += 'if ($rdb -ne $null) { '
                                    set @PSCmd += 'Write-DbaDbTableData -SqlInstance ' + @LocalSqlServerName + ' -Database ' + db_name() + ' -Schema dbo -Table bakgen_backuplastdt_databases_temp -InputObject $rdb;'
                                    set @PSCmd += '} '
                                    set @PSCmd += 'if ($rfg -ne $null) { '
                                    set @PSCmd += 'Write-DbaDbTableData -SqlInstance ' + @LocalSqlServerName + ' -Database ' + db_name() + ' -Schema dbo -Table bakgen_backuplastdt_fgreadonly_temp -InputObject $rfg;'
                                    set @PSCmd += '} '
                                    set @PSCmd += '"'

                                    set @InfoLog = 'Get backup information on replica SQL Server instance ' + QUOTENAME(@AgReplicaName) + ' executing master..xp_cmdshell PowerShell script'
                                    execute dbo.bakgen_p_log       
                                               @ModuleName = @ModuleName,
                                               @ProcedureName = @ProcName,
                                               @ExecuteMode = @Execute,
                                               @LogType = 'INFO',
                                               @DatabaseName = null,
                                               @Information = @InfoLog,
                                               @Script = @PSCmd

                                    --###remove CRLF for xp_cmdshell and PowerShell 
                                    set @PSCmd = replace(replace(@PSCmd, nchar(13), N''), nchar(10), N' ')
                                    set @PSCmd = replace(@PSCmd, '>', N'^>')
                                    --###Execute the powershell command on the replica and store the result in the temporary tables
                                    exec master..xp_cmdshell @PSCmd
                        end
                        
                        fetch next from cur_replica into 
                                    @AgReplicaName                    


            end
            close cur_replica
            deallocate cur_replica


            --############################################################################
            --### Update and insert backup information in final tables
            --############################################################################

            --###Update first the database creation date with the local ones
            Update t
                        set t.DatabaseCreationDate = db.create_date
            from [dbo].[bakgen_backuplastdt_databases_temp] t
                        inner join sys.databases db 
                                    on db.name collate database_default = t.DatabaseName collate database_default 
                                               and db.service_broker_guid = t.ServiceBrokerGuid

            Update t
                        set t.DatabaseCreationDate = db.create_date
            from [dbo].[bakgen_backuplastdt_fgreadonly_temp] t
                        inner join sys.databases db 
                                    on db.name collate database_default = t.DatabaseName collate database_default 
                                               and db.service_broker_guid = t.ServiceBrokerGuid




            BEGIN TRY

                        begin transaction 

                        delete f
                                    from [dbo].[bakgen_backuplastdt_databases_temp] t
                                               inner join [dbo].[bakgen_backuplastdt_databases] f 
                                                           on f.DatabaseCreationDate = t.DatabaseCreationDate
                                                                       and f.DatabaseName = t.DatabaseName 
                                                                       and f.BackupType = t.BackupType 
                                                                       and f.ServerName = t.ServerName 
                                                                       and t.SqlInstanceName = f.SqlInstanceName
                                               where f.LastBackupDate < t.LastBackupDate

                        Insert into [dbo].[bakgen_backuplastdt_databases] (
                                    ServerName,
                                    SqlInstanceName,
                                    SqlServerName,
                                    DatabaseCreationDate,
                                    DatabaseName,
                                    BackupType,
                                    LastBackupDate,
                                    LastBackupSize,
                                    is_primary,
                                    insertdate 
                        )
                        select 
                                    t.ServerName,
                                    t.SqlInstanceName,
                                    t.SqlServerName,
                                    t.DatabaseCreationDate,
                                    t.DatabaseName,
                                    t.BackupType,
                                    t.LastBackupDate,
                                    t.LastBackupSize,
                                    t.is_primary,
                                    t.insertdate 
                                    from [dbo].[bakgen_backuplastdt_databases_temp] t
                                               where not exists (select 1 from [dbo].[bakgen_backuplastdt_databases] f 
                                                                                                                      where f.DatabaseName = t.DatabaseName 
                                                                                                                                  and f.BackupType = t.BackupType 
                                                                                                                                  and f.ServerName = t.ServerName 
                                                                                                                                  and t.SqlInstanceName = f.SqlInstanceName)
                                    
                        
                        commit

                        begin transaction

                        delete f
                                    from [dbo].[bakgen_backuplastdt_fgreadonly_temp] t
                                               inner join [dbo].[bakgen_backuplastdt_fgreadonly] f 
                                                           on f.DatabaseName = t.DatabaseName 
                                                                       and f.BackupType = t.BackupType 
                                                                       and f.filegroup_name = t.filegroup_name
                                                                       and f.ServerName = t.ServerName 
                                                                       and f.SqlInstanceName = t.SqlInstanceName
                                               where f.LastBackupDate < t.LastBackupDate


                        Insert into [dbo].[bakgen_backuplastdt_fgreadonly] (
                                    ServerName,     
                                    SqlInstanceName,
                                    SqlServerName,           
                                    DatabaseCreationDate,
                                    DatabaseName,            
                                    BackupType,
                                    filegroup_name,
                                    file_logicalname,          
                                    filegroup_guid, 
                                    file_guid,          
                                    LastBackupDate,          
                                    LastBackupReadOnlyLsn,
                                    is_primary,
                                    insertdate                     
                        )
                        select 
                                    t.ServerName,   
                                    t.SqlInstanceName,
                                    t.SqlServerName,
                                    t.DatabaseCreationDate,
                                    t.DatabaseName,          
                                    t.BackupType,
                                    t.filegroup_name,
                                    t.file_logicalname,        
                                    t.filegroup_guid,           
                                    t.file_guid,        
                                    t.LastBackupDate,        
                                    t.LastBackupReadOnlyLsn,
                                    t.is_primary,
                                    t.insertdate                   
                        from [dbo].[bakgen_backuplastdt_fgreadonly_temp] t                                        
                                    where not exists (
                                               select 1 from  [dbo].[bakgen_backuplastdt_fgreadonly] f 
                                               where f.DatabaseName = t.DatabaseName 
                                                                       and f.BackupType = t.BackupType 
                                                                       and f.filegroup_name = t.filegroup_name
                                                                       and f.ServerName = t.ServerName 
                                                                       and t.SqlInstanceName = f.SqlInstanceName)

                        
                        commit
            END TRY
            BEGIN CATCH
                SELECT 
                                    @ErrorMessage = ERROR_MESSAGE(), 
                                    @ErrorSeverity = ERROR_SEVERITY(), 
                                    @ErrorState = ERROR_STATE();

                        IF @@TRANCOUNT > 0
                                    ROLLBACK
                        
                        raiserror(@ErrorMessage, @ErrorSeverity, @ErrorState);

            END CATCH



RETURN;

END TRY
BEGIN CATCH
    SELECT 
        @ErrorMessage = ERROR_MESSAGE(), 
        @ErrorSeverity = ERROR_SEVERITY(), 
        @ErrorState = ERROR_STATE();

            set @InfoLog = '@ErrorState = ' + convert(nvarchar, @ErrorState) + '/@ErrorSeverity = ' + convert(nvarchar, @ErrorSeverity) + '/@ErrorMessage = ' + @ErrorMessage
            execute dbo.bakgen_p_log       
                        @ModuleName = @ModuleName,
                        @ProcedureName = @ProcName,
                        @ExecuteMode = @Execute,
                        @LogType = 'ERROR',
                        @DatabaseName = null,
                        @Information = @InfoLog,
                        @Script = null

    raiserror(@ErrorMessage, @ErrorSeverity, @ErrorState);
END CATCH;

RETURN
END

Other Objects needed

As mentioned above, I used the dbatools Write-DbaDbTableData function, so you need to install the module before being able to run the above stored procedure.
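
If dbatools is not yet present on the machine, installing it from the PowerShell Gallery is a one-liner (a minimal sketch; adjust the scope and repository trust settings to your environment):

# Install the dbatools module from the PowerShell Gallery
Install-Module -Name dbatools -Scope CurrentUser -Force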

I also share the two other objects used in the above stored procedure; of course, you can adapt the code to your needs.

Creation of the log table:

--########################################################
--###Backup generator - logs
--########################################################

USE [<YourDatabaseName>]
GO
/*
if OBJECT_ID('[dbo].[bakgen_logs]') is not null
	drop table [dbo].[bakgen_logs]
*/
create table [dbo].[bakgen_logs] (
	id bigint identity(1,1) not null,
	LogDate datetime,
	SqlServerName sysname,
	ModuleName sysname,
	ProcedureName sysname,
	ExecuteMode char(1),
	LogType nvarchar(50),
	DatabaseName sysname null,
	Information nvarchar(max) null,
	Scripts nvarchar(max) null,
CONSTRAINT [PK_bakgen_logs] PRIMARY KEY CLUSTERED 
(
	[id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
)
GO

Creation of the stored procedure to write the logs:

use [<YourDatabaseName>]
if OBJECT_ID('dbo.bakgen_p_log') is not null
	drop procedure dbo.bakgen_p_log 
go

CREATE PROCEDURE dbo.bakgen_p_log 
(
	@ModuleName sysname,
	@ProcedureName sysname,
	@ExecuteMode char(1),
	@LogType nvarchar(50),
	@DatabaseName sysname = null,
	@Information nvarchar(max) =  null,
	@Script nvarchar(max)  = null
)

AS
/************************************
*   dbi-services SA, Switzerland    *
*   http://www.dbi-services.com        *
*************************************
    Group/Privileges..: DBA
    Script Name......:	bakgen_p_log.sql
    Author...........:	Christophe Cosme
    Date.............:	2019-09-20
    Version..........:	SQL Server 2016 / 2017
    Description......:	write information to the log table to keep trace of the step executed

    Input parameters.: 

	Output parameter: 
				
************************************************************************************************
    Historical
    Date        Version    Who    Whats		Comments
    ----------  -------    ---    --------	-----------------------------------------------------
    2019-10-14  1.0        CHC    Creation
************************************************************************************************/ 
BEGIN 

BEGIN TRY
	
	--###variable to store error message
	declare @errmsg varchar(4000)

	if OBJECT_ID('[dbo].[bakgen_logs]') is null
	begin
		set @errmsg = 'bakgen_p_log : table not found - be sure the table exists'
		set @errmsg += '	table name = [dbo].[bakgen_logs]' 
		raiserror (@errmsg,11,1);
	end		

	insert into [dbo].[bakgen_logs] (
		LogDate,
		SqlServerName,
		ModuleName,
		ProcedureName,
		ExecuteMode,
		LogType,
		DatabaseName,
		Information,
		Scripts
		)
	values(
		getdate(),
		convert(sysname,SERVERPROPERTY('servername')),
		@ModuleName,
		@ProcedureName,
		@ExecuteMode,
		@LogType,
		@DatabaseName,
		@Information,
		@Script
		)


RETURN;

END TRY
BEGIN CATCH
	declare 
    @ErrorMessage  NVARCHAR(4000), 
    @ErrorSeverity INT, 
    @ErrorState    INT;
    SELECT 
        @ErrorMessage = ERROR_MESSAGE(), 
        @ErrorSeverity = ERROR_SEVERITY(), 
        @ErrorState = ERROR_STATE();
 
    -- return the error inside the CATCH block
    raiserror(@ErrorMessage, @ErrorSeverity, @ErrorState);
END CATCH;

RETURN
END
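
To quickly verify that the logging procedure works, a smoke test can be run through dbatools (a hypothetical example; the instance name, database name and parameter values are placeholders, not taken from the original setup):

# Hypothetical smoke test: write one INFO line into bakgen_logs
Invoke-DbaQuery -SqlInstance 'InstanceName' -Database '<YourDatabaseName>' -Query "
exec dbo.bakgen_p_log
    @ModuleName = N'bakgen',
    @ProcedureName = N'manual_test',
    @ExecuteMode = 'E',
    @LogType = N'INFO',
    @Information = N'log table smoke test';"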

Conclusion

Triggering PowerShell from a stored procedure did the trick for my special case and is very practical. But finding the right syntax to make the script run through xp_cmdshell was not trivial, and I admit I spent some time figuring out what was causing the issue.
But I definitely enjoyed the solution for retrieving information outside the local SQL Server instance.

The article SQL Server – Collecting last backup information in an AlwaysOn environment first appeared on the dbi services blog.

Restore S3 Object with AWSPOWERSHELL


AWS S3 offers different storage classes, allowing, among other things, to optimize cost.
For instance, some classes are used for archiving purposes: S3 Glacier and S3 Glacier Deep Archive. The storage cost is the lowest you can obtain, but your data is not available immediately and the access cost is higher.

In the case of the S3 archive classes, retrieving the data is not cost-effective, because this is clearly not what they are aimed at. They are meant for data you want to keep for some reason (legal, insurance…) but very rarely need to access.
Database backups are clearly one scenario these storage classes are designed for.

But what happens if I need this data? How do I proceed? We will answer these questions using the AWSPOWERSHELL module.
Of course, the AWS CLI is another possible and well-documented approach. But, in my opinion, it is less reusable (integration into a custom module is less convenient, for instance) and less in the PowerShell “philosophy”.

1- Select your S3 object

First of all, you need to find the object you have to retrieve. To do so, several pieces of information are necessary:

  • the bucket name where the object resides (mandatory)
  • the key (optional): returns the object matching the exact key
  • the key prefix (optional): returns all objects with a key starting with this prefix

The cmdlet you need for this is called Get-S3Object. Here are some examples of usage:

# Retrieve object from a specific key
Get-S3Object -BucketName $BucketName -Key $Key

# Retrieve objects from a key prefix
Get-S3Object -BucketName $BucketName -KeyPrefix $KeyPrefix

It is not possible with this cmdlet to retrieve an object only by its name: you need to know the key or the beginning of the key (key prefix).
Of course, a search inside PowerShell is possible, but you will need to retrieve ALL the objects in a bucket before filtering… so you are dependent on the number of objects in your bucket.

Moreover, to retrieve information regarding restore status, you need to look into metadata with cmdlet Get-S3ObjectMetadata.
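
For a single object, the restore-related metadata can be inspected directly (a small sketch; bucket and key are placeholders):

# Check the restore status of one archived object
$Metadata = Get-S3ObjectMetadata -BucketName 'my-bucket' -Key 'backups/mydb.bak'
$Metadata.RestoreInProgress    # True while the retrieval is still running
$Metadata.RestoreExpiration    # Date until which the restored copy remains accessible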

To make the search simple with the desired information, I created a custom function that accepts the partial name of an S3 object as input and personalizes the output:

Function Get-dbiS3Object(){
    param(
        [Parameter(Mandatory=$true)]
        [String]
        $BucketName,
        [String]
        $Key,
        [String]
        $KeyPrefix = '',
        [String]
        $Name = ''
    )

    $Command = 'Get-S3Object -BucketName ' + '"' + $BucketName + '"';

    If ($KeyPrefix){
        $Command += ' -KeyPrefix ' + '"' + $KeyPrefix + '"';
    }
    If ($Key){
        $Command += ' -Key ' + '"' + $Key + '"';
    }
    If ($Name){
        $Command += ' | Where-Object Key -Match ' + '"' + $Name + '"';
    }

    $Objects = Invoke-Expression $Command;


    If ($Objects){
        @($Objects) | ForEach-Object -Begin {`
                        [System.Collections.ArrayList] $S3CustomObjects = @();}`
                  -Process {`
                           $Metadata = $_ | Get-S3ObjectMetadata;`
                           $S3CustomObj = [PSCustomObject]@{`
                                         BucketName = "$($_.BucketName)";`
                                         Key = "$($_.Key)";`
                                         StorageClass = "$($_.StorageClass)";`
                                         LastModified = "$($_.LastModified)";`
                                         SizeInB = "$($_.Size)";`
                                         RestoreExpirationUtc = "$($Metadata.RestoreExpiration)";`
                                         RestoreInProgress = "$($Metadata.RestoreInProgress)";`
                                         ExpirationRule = "$($Metadata.Expiration.RuleId)";`
                                         ExpiryDateUtc= "$($Metadata.Expiration.ExpiryDateUtc)";`
                           };`
                           $Null = $S3CustomObjects.Add($S3CustomObj);`
                  };
    }

  return $S3CustomObjects;
}
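
A typical call could look like this (bucket, prefix and name are placeholders):

# Find all objects under a prefix whose key contains 'mydb'
Get-dbiS3Object -BucketName 'my-backup-bucket' -KeyPrefix 'PROD/' -Name 'mydb' | Format-Table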

2- Restore the S3 Object

Once you have selected your objects, you have to create a request to make them accessible. Indeed, in Glacier, your objects are not accessible until a restore request is performed: they are archived (“frozen”, so to speak).
For Glacier, there are 3 archive retrieval options:

  • Expedited: 1-5 minutes for the highest cost
  • Standard: 3-5 hours for a lower cost
  • Bulk: 5-12 hours for the lowest cost

So after this request you will have to wait, depending on your archive retrieval option.

This request is performed with the cmdlet Restore-S3Object.
Here is an example of usage:

# CopyLifetimeInDays is the number of days the object remains accessible before it is frozen again
Restore-S3Object -BucketName $element.BucketName -Key $element.Key -CopyLifetimeInDays $CopyLifetimeInDays -Tier $TierType
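
With concrete values, a call could look like this (bucket and key are placeholders):

# Restore one archived object for 5 days using the Standard retrieval tier
Restore-S3Object -BucketName 'my-backup-bucket' -Key 'backups/mydb.bak' -CopyLifetimeInDays 5 -Tier Standard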

By using our previous custom cmdlet called Get-dbiS3Object, we can also build a new custom cmdlet to simplify the process:

Function Restore-dbiS3Object (){
    param(
        $CustomS3Objects,
        [String]
        $Key,
        [String]
        $KeyPrefix,
        [String]
        $BucketName,
        [Amazon.S3.GlacierJobTier]
        $Tier='Bulk',              # Default archive retrieval option if nothing specified
        [int]
        $CopyLifetimeInDays = 5    # Default number of days if nothing specified
    )

    If ($CustomS3Objects){
        @($CustomS3Objects) | Foreach-Object -Process {`
            If ((-not $_.RestoreExpirationUtc) -and (-not $_.RestoreInProgress) -and ($_.StorageClass -eq 'Glacier') -and ($_.SizeInB -gt 0)) {`
                Restore-S3Object -BucketName $_.BucketName -Key $_.Key -CopyLifetimeInDays $CopyLifetimeInDays -Tier $Tier;`
            }`
        }
    }
    elseif ($Key -and $BucketName){
        $Objects = Get-dbiS3Object -BucketName $BucketName -Key $Key;
        # forward the tier and lifetime so they are not silently reset to the defaults
        Restore-dbiS3Object -CustomS3Objects $Objects -Tier $Tier -CopyLifetimeInDays $CopyLifetimeInDays;
    }
    elseif ($KeyPrefix -and $BucketName){
        $Objects = Get-dbiS3Object -BucketName $BucketName -KeyPrefix $KeyPrefix;
        Restore-dbiS3Object -CustomS3Objects $Objects -Tier $Tier -CopyLifetimeInDays $CopyLifetimeInDays;
    }
}
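
It can then be called either with the output of Get-dbiS3Object or directly with a bucket name and key prefix (placeholders below):

# Restore all matching objects with the default Bulk tier and 5-day lifetime
$Objects = Get-dbiS3Object -BucketName 'my-backup-bucket' -KeyPrefix 'PROD/' -Name 'mydb'
Restore-dbiS3Object -CustomS3Objects $Objects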

To check whether the retrieval is finished and the object is accessible for download, you can obtain this information with the custom function Get-dbiS3Object.

Of course, these two custom functions are perfectible and could be customized differently. The goal of this blog post is mostly to introduce the potential of this PowerShell module and to give examples of integration into a custom PowerShell module to make daily life easier 🙂

The article Restore S3 Object with AWSPOWERSHELL first appeared on the dbi services blog.

SQL Server: Quickly clean backup history with dbatools


I just had to restore a database in production for my customer. Before doing a restore, I have the habit of querying the msdb.dbo.backupset table to get an overview of the last backups.

When running my query, I felt it was taking longer than usual. So out of curiosity, I looked at the SSMS standard report “Disk Usage by Top Tables”. Here is the output.

This instance contains dozens of databases in Always On Availability Groups with a transaction log backup frequency set to 30 minutes. The backup history has never been cleaned, which explains the large number of rows.

It’s not often that I see an msdb database with a size of 3.5 GB, so I decided it was time to delete the backup history. My customer has many instances that are configured and managed the same way, so I’m sure this phenomenon is present on many servers.

I could easily use the system stored procedure sp_delete_backuphistory, but I decided to use PowerShell and dbatools instead. I only recently started using dbatools, and I want more practice using PowerShell for tasks like this one that need to be done on many instances.
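
For reference, the T-SQL alternative can also be driven from PowerShell; here is a minimal sketch with dbatools (the instance name is a placeholder):

# Delete backup history older than 120 days using the system procedure
$oldest = (Get-Date).AddDays(-120).ToString('yyyy-MM-dd')
Invoke-DbaQuery -SqlInstance 'InstanceName' -Database 'msdb' -Query "exec dbo.sp_delete_backuphistory @oldest_date = '$oldest';"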

First, like the SSMS report, I’d like to get the row count and the amount of data used by the backup history tables in all my msdb databases. I want to measure the actual gain in data space after the cleanup. To do this, I decided to use the Get-DbaDbTable function from dbatools.

Get-DbaDbTable -SqlInstance 'InstanceName' -Database msdb `
    | Where-Object {$_.Name -Like 'backup*'} `
    | Select-Object -Property Name, RowCount, DataSpaceUsed `
    | Out-GridView

I use a Central Management Server as an inventory for my SQL servers.

The list of servers can be easily retrieved from the CMS with Get-DbaRegisteredServer.

$Servers = Get-DbaRegisteredServer -SqlInstance 'MyCmsInstance' | Where-Object {$_.Group -Like '*Prod*'};

I have 36 production servers.

PS C:\> $Servers.Count
36

Now, looping through each instance I do the sum of the backup history rows with the total space used.

$Servers = Get-DbaRegisteredServer -SqlInstance 'MyCmsInstance' | Where-Object {$_.Group -Like '*Prod*'};
foreach ($srv in $Servers) {
    Get-DbaDbTable -SqlInstance $srv -Database msdb `
        | Where-Object {$_.Name -Like 'backup*'} `
        | ForEach-Object -Process {$rowsBefore+=$_.RowCount; $sizeBefore+=$_.DataSpaceUsed}
}
Write-Output "backup history total rows: $rowsBefore" 
Write-Output "backup history total size: $sizeBefore" 

PS C:\>
backup history total rows: 31989560
backup history total size: 10343088

I’ve got a total of almost 32 million rows of backup history on my production servers, for a total data size exceeding 10 GB (DataSpaceUsed is reported in KB).

To clean the backup history, I use the Remove-DbaDbBackupRestoreHistory function. I decided to keep a backup history of about 4 months, so I chose the arbitrary number of 120 as the value for the KeepDays parameter.

foreach ($srv in $Servers) {
    Remove-DbaDbBackupRestoreHistory -SqlInstance $srv -KeepDays 120 -Confirm:$false
}

After cleaning the backup history, I run the first loop once again to get the msdb table information, so I can compare the row count and data space used before and after the Remove function.
Here is the result.

Diff rows: 24654309
Diff size: 8047072

I just deleted over 24 million rows, about 8 GB of data space, in the msdb databases over 36 instances. All of this was done with a few lines of PowerShell and dbatools in a really short time. As a DBA managing dozens of instances, automating and scripting tasks like this with dbatools becomes very easy and can save a lot of time.

You can find the whole script below. Please feel free to comment if you think it can be written in a more efficient way; I will take any advice on PowerShell scripting.

$Servers = Get-DbaRegisteredServer -SqlInstance 'MyCmsInstance' | Where-Object {$_.Group -Like '*Prod*'};
foreach ($srv in $Servers) {
    Get-DbaDbTable -SqlInstance $srv -Database msdb `
        | Where-Object {$_.Name -Like 'backup*'} `
        | ForEach-Object -Process {$rowsBefore+=$_.RowCount; $sizeBefore+=$_.DataSpaceUsed}
}
Write-Output "backup history total rows: $rowsBefore" 
Write-Output "backup history total size: $sizeBefore" 

foreach ($srv in $Servers) {
    Remove-DbaDbBackupRestoreHistory -SqlInstance $srv -KeepDays 120 -Confirm:$false
}

Start-Sleep -Seconds 10

foreach ($srv in $Servers) {
    Get-DbaDbTable -SqlInstance $srv -Database msdb `
        | Where-Object {$_.Name -Like 'backup*'} `
        | ForEach-Object -Process {$rowsAfter+=$_.RowCount; $sizeAfter+=$_.DataSpaceUsed}
}
$diffRows= $rowsBefore-$rowsAfter
$diffSize= $sizeBefore-$sizeAfter

Write-Output "Diff rows: $diffRows" 
Write-Output "Diff size: $diffSize"


The article SQL Server: Quickly clean backup history with dbatools first appeared on the dbi services blog.

How to run Avamar backup on SQL Agent Job with PowerShell


At one of our customers, we use Avamar as the backup and restore solution. This customer asked me to find a solution to run Avamar backups for a group of databases on a specified instance. In fact, we are currently trying a database clone solution on some instances: the cloned databases must not be backed up, but the rest of the databases must be.

After some discussions with my colleague, I decided to use a PowerShell script which is called in a SQL job step.
This PowerShell script selects the databases to back up and uses the avsql command to run the backups. avsql is the command line interface for Avamar and SQL Server; it offers plenty of functionality, including operations like browsing, backing up or restoring an instance.

Prerequisites

There are some prerequisites to use this command line.
The first one is to create an avsql command file where we have to define some parameters for the command we would like to execute:

  • operation to perform
  • id of the user used to run the operation
  • user password (encrypted)
  • Avamar server
  • client where to run the operation
  • retention 60 days

Here is a picture of my file:
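
As the screenshot is not reproduced here, the following sketch shows roughly what such a flag file could contain. The flag names are illustrative assumptions derived from the parameter list above, not verified avsql syntax; refer to the Avamar documentation for the exact flags:

# C:\ProgramData\Bis\Avamar\avsql_backup.cmd - illustrative sketch only, flag names are assumptions
--operation=backup
--id=backupuser
--ap=<encrypted-password>
--server=avamarserver.yourdomain.com
--client=/yourdomain/yourservername
--expires=60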

The second one is to create the user mentioned above.
To do so, open your Avamar Administrator console, go to Administration > Account Management, and search for the client where you want to create the user. In the User tab, right-click and add a new user. Select the role “Back up/Restore User”, give your user a name and enter a password. Once done, the user is created:

Job creation

Once done, we can connect to the SQL Server instance and create a SQL Agent job.
I don’t want to spend too much time on the job creation itself, as there is no added value in that:

The step will just have to run a PowerShell script which will be executed by the SQL Server Agent Service Account:

The interesting part is the script itself, which I copy here:

$instance = 'yourserver\instancename'
$DatabasesList = invoke-sqlcmd -query "select distinct db_name(database_id) as name from sys.master_files where physical_name not like '%SqlClone%'" -serverinstance $instance -QueryTimeout 1000 -ConnectionTimeout 1000 -SuppressProviderContextWarning
Foreach ($Database in $DatabasesList)
{
#prepare database name for Avamar backup
$dbname = $Database.name
$db = "$instance/$dbname"
#prepare command line for Avamar backup
$BackupCmd = "avsql --flagfile=C:\ProgramData\Bis\Avamar\avsql_backup.cmd --brtype=full --label=Daily_Full_Clone_Server """ + $db + """"
#Run backup command via Avamar
powershell.exe -command "Start-Process 'cmd' -verb runas -ArgumentList '/c $BackupCmd'"
}

The trickiest part was finding the right way to execute the avsql command.
Indeed, this command must be executed with elevation via a cmd process.
The part powershell.exe -command "Start-Process 'cmd' -verb runas -ArgumentList '/c $BackupCmd'" shows how I finally managed it:
I execute a PowerShell command which runs a command prompt with elevation, with my Avamar command line as argument.
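
Isolated from the job context, the elevation pattern itself boils down to this (a minimal sketch; 'whoami /groups' is just a stand-in for the Avamar command line):

# Run an arbitrary command line in an elevated cmd.exe from PowerShell
Start-Process -FilePath 'cmd' -Verb RunAs -ArgumentList '/c whoami /groups'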
Once scheduled, our job will create Avamar full backups for the selected databases.

I hope this can help Avamar users 😉

The article How to run Avamar backup on SQL Agent Job with PowerShell first appeared on the dbi services blog.

SQL Server: Synchronize logins on AlwaysOn replicas with dbatools


The SQL Server environment I worked with today has dozens of SQL Server instances using AlwaysOn Availability Groups for high availability.
When a login is created on the primary replica of an Availability Group, it is not synchronized automatically to the secondary replicas. This might cause some issues after a failover (failed logins).

Since this is not done automatically by SQL Server out of the box, the DBA has to perform this task.
To avoid doing this with T-SQL (sp_help_revlogin) or SSMS, I use the magical dbatools and perform the following tasks once a week.

  1. Get the number of logins on each instance.
$primary = Get-DbaLogin -SqlInstance 'SQL01\APPX'
$primary.Count

$secondary = Get-DbaLogin -SqlInstance 'SQL02\APPX'
$secondary.Count
PS C:\> $primary.Count
41
PS C:\> $secondary.Count
40
  2. If the numbers don’t match, I use the Copy-DbaLogin function to synchronize the missing logins.
Copy-DbaLogin -Source 'SQL01\APPX' -Destination 'SQL02\APPX' `
	-Login (($primary | Where-Object Name -notin $secondary.Name).Name)

PS C:\>
Type             Name         Status     Notes
----             ----         ------     -----
Login - SqlLogin loginName    Successful

Obviously, there are many drawbacks to this process:

  • Having the same number of logins doesn’t mean they are actually the same.
    Some logins can be missing on each side while both instances still show the same login count.
  • I need to know which instance is the current primary replica (-Source in Copy-DbaLogin).
  • This is a manual process I do on every pair of instances using AlwaysOn.
  • I want a script that can manage any number of secondary replicas.

So I decided to write a new script that automatically synchronizes logins from the primary replica to all secondary replicas. The only parameter I want to use as input for this script is the name of the listener.

Here is the simplest version of this script I could write:

$lsn = 'APP01-LSTN'

$primaryReplica =    Get-DbaAgReplica -SqlInstance $lsn | Where Role -eq Primary
$secondaryReplicas = Get-DbaAgReplica -SqlInstance $lsn | Where Role -eq Secondary

# primary replica logins
$primaryLogins = (Get-DbaLogin -SqlInstance $primaryReplica.Name)

$secondaryReplicas | ForEach-Object {
    # secondary replica logins
    $secondaryLogins = (Get-DbaLogin -SqlInstance $_.Name)

    $diff = $primaryLogins | Where-Object Name -notin ($secondaryLogins.Name)
    if($diff) {
        Copy-DbaLogin -Source $primaryReplica.Name -Destination $_.Name -Login $diff.Name
    }   
}

Using just the listener name with Get-DbaAgReplica I can get all the replicas by Role, either Primary or Secondary.
Then I just need to loop through the secondary replicas and call Copy-DbaLogin.

I use a Central Management Server as an inventory for my SQL servers. I have groups containing only listeners.

CMS

The list of listeners can be easily retrieved from the CMS with Get-DbaRegisteredServer.

$Listeners= Get-DbaRegisteredServer -SqlInstance 'MyCmsInstance' | Where-Object {$_.Group -Like '*Prod*Listener'};

Now, looping through each listener, I can sync dozens of secondary replicas in my SQL Server production environment with a single script run.
I had some issues with instances having multiple availability groups, so I added “Sort-Object -Unique”.
Notice I also filtered out some logins I don’t want to synchronize.

$Listeners = Get-DbaRegisteredServer -SqlInstance 'MyCmsInstance' | Where-Object {$_.Group -Like '*Prod*Listener*'};

foreach ($lsn in $Listeners) {

    $primaryReplica =    Get-DbaAgReplica -SqlInstance $lsn.ServerName | Where Role -eq Primary | Sort-Object Name -Unique
    $secondaryReplicas = Get-DbaAgReplica -SqlInstance $lsn.ServerName | Where Role -eq Secondary | Sort-Object Name -Unique
    <#
    Some instances have more than 1 AvailabilityGroup
        => Added Sort-Object -Unique
    #>

    # primary replica logins
    $primaryLogins = (Get-DbaLogin -SqlInstance $primaryReplica.Name -ExcludeFilter '##*','NT *','BUILTIN*', '*$')
    
    $secondaryReplicas | ForEach-Object {
        # secondary replica logins
        $secondaryLogins = (Get-DbaLogin -SqlInstance $_.Name -ExcludeFilter '##*','NT *','BUILTIN*', '*$')
        
        $diff = $primaryLogins | Where-Object Name -notin ($secondaryLogins.Name)
        if($diff) {
            Copy-DbaLogin -Source $primaryReplica.Name -Destination $_.Name -Login ($diff.Name) -WhatIf
        } 
    }  
}

Do not test this script in Production. Try it in a safe environment first, then remove the “-WhatIf” switch.
The next step for me might be to run this script on a schedule. Or even better, trigger the execution after an Availability Group failover?

Copy-DbaLogin is one of the many dbatools commands that can be very useful to synchronize objects between instances.

The article SQL Server: Synchronize logins on AlwaysOn replicas with dbatools first appeared on the dbi services blog.


How to create an Azure SQL Database using Azure PowerShell


In this blog post, I’ll go through the steps to create an Azure SQL Database using Azure PowerShell.

Introduction to Azure SQL Database

The SQL database services provided by Microsoft in the cloud are now grouped under the name Azure SQL.

The Azure SQL family contains services that I will briefly summarize:

  • Azure SQL Database – DBaaS (Database-as-a-Service)
  • Azure SQL Managed Instance – PaaS (Platform-as-a-Service)
  • SQL Server on Azure VMs – IaaS (Infrastructure-as-a-Service)

In this blog post, I will use Azure SQL Database.

Azure SQL Database offers the following deployment options:

  • Single database – a fully-managed, isolated database
  • Elastic pool – a collection of single databases with a shared set of resources

I will not describe this service in detail, but basically it is a fully managed SQL database, similar to a contained database in SQL Server.

All the steps below can be done on the Azure Portal. For this blog post, I’ll only use Azure PowerShell which you can install on your operating system or use online with Azure Cloud Shell.

1) Install and Import Az module

First, we need to install Azure PowerShell which provides a set of commands to manage your Azure resources from your favorite operating system; Windows, macOS, and Linux.

PS C:\> Install-Module Az
PS C:\> Get-InstalledModule -Name Az | select Name, Version

Name Version
---- -------
Az   4.1.0

PS C:\> Import-Module Az

2) Sign in to Azure

Connect to your tenant using your Tenant ID.
You can find your Tenant ID in the Azure Portal under “Azure Active Directory”.

PS C:\> Connect-AzAccount -Tenant 'b9c70123-xxx-xxx-xxx-xxxx'

Account           SubscriptionName     TenantId                      Environment
-------           ----------------     --------                      -----------
my@Email.com      Visual Studio Ent    b9c70978-xxx-xxx-xxx-xxxx     AzureCloud

PS C:\>

Then, if you use multiple Azure subscriptions, select the one you want to work with.

PS C:\> Set-AzContext -SubscriptionId '891f5acc-xxx-xxx-xxx-xxxx'

3) Create a Resource Group

Let’s start with creating a Resource Group. A resource group is a container that holds related resources for an Azure solution.

PS C:\> New-AzResourceGroup -Name 'SQLFailover-lab-rg' -Location 'France Central'

ResourceGroupName : SQLFailover-lab-rg
Location          : francecentral
ProvisioningState : Succeeded
Tags              :
ResourceId        : /subscriptions/891f5acc-xxx-xxx-xxx-xxxx/resourceGroups/SQLFailover-lab-rg

To list all your Resource Groups use the Get-AzResourceGroup command:

Get-AzResourceGroup | select ResourceGroupName

4) Create an SQL Server

Create a logical server with a unique server name to host our SQL databases.

New-AzSqlServer -ResourceGroupName 'SQLFailover-lab-rg' `
    -ServerName 'snasqlsrv-lab-01' `
    -Location 'France Central' `
    -SqlAdministratorCredentials $(New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList "LabSqlAdmin", $(ConvertTo-SecureString -String "MyPassword" -AsPlainText -Force))

The last parameter defines the credentials I will use to connect as an administrator to my SQL Database server.
Once the server is created you get the FQDN that will be used for connections.

PS C:\> Get-AzSqlServer | select FullyQualifiedDomainName

FullyQualifiedDomainName
------------------------
snasqlsrv-lab-01.database.windows.net

5) Create a Server Firewall Rule

To access the server and all the databases from my client computer I need to create a server firewall rule.
Here I use a WebRequest to get my public IP into a variable and then create the server firewall rule.

$myIp = (Invoke-WebRequest ifconfig.me/ip).Content
New-AzSqlServerFirewallRule -ResourceGroupName 'SQLFailover-lab-rg' `
    -ServerName 'snasqlsrv-lab-01' `
    -FirewallRuleName "AllowedIPs" -StartIpAddress $myIp -EndIpAddress $myIp

6) Connect to the SQL Server from SSMS

The SQL Server is now accessible from my client computer on port 1433. I can connect to it using SSMS.

ConnectSSMS
SSMS

7) Create a database

The following command will create a database named “DB01” with an S0 performance level, using the sample schema “AdventureWorksLT”.

New-AzSqlDatabase  -ResourceGroupName 'SQLFailover-lab-rg' `
    -ServerName 'snasqlsrv-lab-01' `
    -DatabaseName 'DB01' `
    -RequestedServiceObjectiveName "S0" `
    -SampleName "AdventureWorksLT"

This is it. We just created an Azure SQL Database with a few commands.

Bonus: Creating a Copy of the database

I just want to mention a nice T-SQL command available in Azure SQL Database that doesn’t exist on-premises: “CREATE DATABASE … AS COPY OF”.
This command creates a copy of a database with a new name. It replaces the backup/“restore with move” that we sometimes do on SQL Server.
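
As a sketch, the copy can be created while connected to the master database of the logical server (the copy name is a placeholder; the server and credentials are the ones created above):

# Create DB01_copy as a copy of DB01 on the same logical server
Invoke-Sqlcmd -ServerInstance 'snasqlsrv-lab-01.database.windows.net' `
    -Database 'master' -Username 'LabSqlAdmin' -Password 'MyPassword' `
    -Query 'CREATE DATABASE DB01_copy AS COPY OF DB01;'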

Cleanup

When you’re done with your tests, you can delete all the resources in the resource group (firewall rules, server, databases) with a single command:

PS C:\> Remove-AzResourceGroup -ResourceGroupName 'SQLFailover-lab-rg'

 

 

The article How to create an Azure SQL Database using Azure PowerShell first appeared on the dbi services blog.

Publishing a PowerShell script to AWS Lambda


I’ve done some Lambda functions with Python in the past, and it was quite easy to publish them to Lambda (by just uploading a zip file with all my code and dependencies). You might ask yourself why I want to do that with PowerShell, but the reason is quite simple: there was a requirement at a customer to automatically collect all the KBs that are installed in the AWS Windows WorkSpaces for compliance reasons. Doing that for EC2 or on-prem instances is quite easy using Lambda for Python against SSM when you are using SSM for patching, but if you want to list the installed KBs of your deployed AWS WorkSpaces you need a different way of doing it. After discussing this with AWS Support, it turned out that the easiest solution is to use the PowerShell Get-HotFix cmdlet remotely against the AWS WorkSpaces. Easy, I thought: if I can deploy Python code in Lambda, I can easily do this for PowerShell as well. But this is definitely not true, as the process is quite different. So, here we go …
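
To illustrate the underlying idea, listing the installed KBs of a remote Windows machine essentially comes down to this (a minimal sketch; the computer name is a placeholder and remote access is assumed to be configured):

# List installed KBs on a remote Windows machine
Get-HotFix -ComputerName 'WORKSPACE-PC01' | Select-Object HotFixID, InstalledOn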

The first bit you need to prepare is a PowerShell development environment for AWS. As I am running Linux (KDE Neon, if you want to know exactly) and PowerShell has been available on Linux for quite some time, I’ll show how to do that on Linux (the process is more or less the same for Windows, though).
Obviously, PowerShell needs to be installed, and this is documented quite well by Microsoft, so no need to explain it further. Basically, it is a matter of:

$ wget -q https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb
$ sudo dpkg -i packages-microsoft-prod.deb
$ sudo apt-get update
$ sudo add-apt-repository universe
$ sudo apt-get install -y powershell

… and that’s it (take care to follow the steps for your Linux distribution). Once that is done PowerShell can be started:

$ pwsh
PowerShell 7.0.2
Copyright (c) Microsoft Corporation. All rights reserved.

https://aka.ms/powershell
Type 'help' to get help.

PS /home/dwe> 

The first additional module you’ll need is AWSLambdaPSCore:

PS /home/dwe> Install-Module AWSLambdaPSCore -Scope CurrentUser

Untrusted repository
You are installing the modules from an untrusted repository. If you trust this repository, change its InstallationPolicy value by running the 
Set-PSRepository cmdlet. Are you sure you want to install the modules from 'PSGallery'?
[Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "N"): Y

Usually you want to work with other AWS services in your Lambda code, so it is recommended to install the AWS.Tools.Installer module, as it provides a convenient way of installing the various tools required for working with the different AWS services. In addition, the AWSPowerShell.NetCore module is required:

PS /home/dwe> Install-Module -Name AWS.Tools.Installer -Force    
PS /home/dwe> Install-Module -name AWSPowerShell.NetCore -Force 

Now, depending on which AWS services you want to work with, just install what you need (in this example EC2, S3 and WorkSpaces):

PS /home/dwe> Install-AWSToolsModule AWS.Tools.EC2,AWS.Tools.S3 -CleanUp -Force                                                                             
Installing module AWS.Tools.EC2 version 4.0.6.0                                                                                                             
Installing module AWS.Tools.S3 version 4.0.6.0                                                                                                              
PS /home/dwe> Install-AWSToolsModule AWS.Tools.Workspaces -CleanUp -Force                                                                                   
Installing module AWS.Tools.WorkSpaces version 4.0.6.0  

Once you have that ready you can use the AWS tools for PowerShell to generate a template you can start with:

PS /home/dwe> Get-AWSPowerShellLambdaTemplate

Template                     Description
--------                     -----------
Basic                        Bare bones script                                                                                                              
CloudFormationCustomResource PowerShell handler base for use with CloudFormation custom resource events
CodeCommitTrigger            Script to process AWS CodeCommit Triggers
DetectLabels                 Use Amazon Rekognition service to tag image files in S3 with detected labels.
KinesisStreamProcessor       Script to be process a Kinesis Stream
S3Event                      Script to process S3 events
S3EventToSNS                 Script to process SNS Records triggered by S3 events
S3EventToSNSToSQS            Script to process SQS Messages, subscribed to an SNS Topic that is triggered by S3 events
S3EventToSQS                 Script to process SQS Messages triggered by S3 events
SNSSubscription              Script to be subscribed to an SNS Topic
SNSToSQS                     Script to be subscribed to an SQS Queue, that is subscribed to an SNS Topic
SQSQueueProcessor            Script to be subscribed to an SQS Queue


PS /home/dwe> cd ./Documents/aws
PS /home/dwe/Documents/aws> New-AWSPowerShellLambda -ScriptName MyFirstPowershellLambda -Template Basic
Configuring script to use installed version 4.0.6.0 of (@{ ModuleName = 'AWS.Tools.Common'; ModuleVersion = '4.0.5.0' }.Name)
Created new AWS Lambda PowerShell script MyFirstPowershellLambda.ps1 from template Basic at /home/dwe/Documents/aws/MyFirstPowershellLambda

PS /home/dwe/Documents/aws/MyFirstPowershellLambda> ls
MyFirstPowershellLambda.ps1  readme.txt

The generated template is quite simple but it gives you an idea how to start:

PS /home/dwe/Documents/aws/MyFirstPowershellLambda> cat ./MyFirstPowershellLambda.ps1
# PowerShell script file to be executed as a AWS Lambda function. 
# 
# When executing in Lambda the following variables will be predefined.
#   $LambdaInput - A PSObject that contains the Lambda function input data.
#   $LambdaContext - An Amazon.Lambda.Core.ILambdaContext object that contains information about the currently running Lambda environment.
#
# The last item in the PowerShell pipeline will be returned as the result of the Lambda function.
#
# To include PowerShell modules with your Lambda function, like the AWS.Tools.S3 module, add a "#Requires" statement
# indicating the module and version. If using an AWS.Tools.* module the AWS.Tools.Common module is also required.

#Requires -Modules @{ModuleName='AWS.Tools.Common';ModuleVersion='4.0.6.0'}

# Uncomment to send the input event to CloudWatch Logs
# Write-Host (ConvertTo-Json -InputObject $LambdaInput -Compress -Depth 5)

Just add the modules for the specific AWS services you want to work with in the “#Requires” section (you need to install them beforehand, of course) and write your script:

PS /home/dwe/Documents/aws/MyFirstPowershellLambda> cat ./MyFirstPowershellLambda.ps1
# PowerShell script file to be executed as a AWS Lambda function. 
# 
# When executing in Lambda the following variables will be predefined.
#   $LambdaInput - A PSObject that contains the Lambda function input data.
#   $LambdaContext - An Amazon.Lambda.Core.ILambdaContext object that contains information about the currently running Lambda environment.
#
# The last item in the PowerShell pipeline will be returned as the result of the Lambda function.
#
# To include PowerShell modules with your Lambda function, like the AWS.Tools.S3 module, add a "#Requires" statement
# indicating the module and version. If using an AWS.Tools.* module the AWS.Tools.Common module is also required.

#Requires -Modules @{ModuleName='AWS.Tools.Common';ModuleVersion='4.0.6.0'}
#Requires -Modules @{ModuleName='AWS.Tools.S3';ModuleVersion='4.0.6.0'}
#Requires -Modules @{ModuleName='AWS.Tools.EC2';ModuleVersion='4.0.6.0'}

# Uncomment to send the input event to CloudWatch Logs
# Write-Host (ConvertTo-Json -InputObject $LambdaInput -Compress -Depth 5)
Write-Output "Test"

The AWS documentation for the PowerShell cmdlets can be found in the AWS Tools for PowerShell Cmdlet Reference.

Assuming the script is completed (the one above does a simple print to the console), you need to deploy it to Lambda. For Python, all you need to do is zip your code and upload it to AWS Lambda. For PowerShell, you need to call the “Publish-AWSPowerShellLambda” cmdlet, passing in the script, a name for the Lambda function, and the AWS region you want the function deployed to:

PS /home/dwe/Documents/aws/MyFirstPowershellLambda> Publish-AWSPowerShellLambda -ScriptPath ./MyFirstPowershellLambda.ps1 -Name MyFirstPowershellLambda  -Region eu-central-1

… and this will fail with:

Get-Command: /home/dwe/.local/share/powershell/Modules/AWSLambdaPSCore/2.0.0.0/Private/_DeploymentFunctions.ps1:544
Line |
 544 |      $application = Get-Command -Name dotnet
     |                     ~~~~~~~~~~~~~~~~~~~~~~~~
     | The term 'dotnet' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name,
     | or if a path was included, verify that the path is correct and try again.

Exception: /home/dwe/.local/share/powershell/Modules/AWSLambdaPSCore/2.0.0.0/Private/_DeploymentFunctions.ps1:547
Line |
 547 |          throw '.NET Core 3.1 SDK was not found which is required to b …
     |          ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
     | .NET Core 3.1 SDK was not found which is required to build the PowerShell Lambda package bundle. Download the .NET Core 3.1 SDK from
     | https://www.microsoft.com/net/download

The error message is quite clear: you need to install the .NET Core 3.1 SDK. As we added the Microsoft repositories above, this is just a matter of (again, adjust for your package manager):

$ sudo apt-get install -y dotnet-sdk-3.1

Trying the same again and this time it succeeds:

PS /home/dwe/Documents/aws/MyFirstPowershellLambda> Publish-AWSPowerShellLambda -ScriptPath ./MyFirstPowershellLambda.ps1 -Name MyFirstPowershellLambda  -Region eu-central-1
Staging deployment at /tmp/MyFirstPowershellLambda
Configuring PowerShell to version 7.0.0
Generating C# project /tmp/MyFirstPowershellLambda/MyFirstPowershellLambda.csproj used to create Lambda function bundle.
Generating /tmp/MyFirstPowershellLambda/Bootstrap.cs to load PowerShell script and required modules in Lambda environment.
Generating aws-lambda-tools-defaults.json config file with default values used when publishing project.
Copying PowerShell script to staging directory
...
... zipping:   adding: Namotion.Reflection.dll (deflated 58%)
... zipping:   adding: System.Diagnostics.PerformanceCounter.dll (deflated 60%)
... zipping:   adding: MyFirstPowershellLambda.ps1 (deflated 53%)
... zipping:   adding: System.Management.dll (deflated 62%)
... zipping:   adding: Markdig.Signed.dll (deflated 62%)
... zipping:   adding: libpsl-native.so (deflated 69%)
...
Creating new Lambda function MyFirstPowershellLambda
Enter name of the new IAM Role:
dwe-tmp-role
...
Select IAM Policy to attach to the new role and grant permissions
    1) AWSLambdaFullAccess (Provides full access to Lambda, S3, DynamoDB, CloudWatch Metrics and  ...)
    2) AWSLambdaReplicator
...
1
Waiting for new IAM Role to propagate to AWS regions
...............  Done
New Lambda function created

Heading over to the AWS console, we can see that the function is there.
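
If you prefer to verify from the command line instead of the console, something along these lines should work, assuming the AWS.Tools.Lambda module is installed and credentials are configured (this part is not in the original post):

# Hypothetical check from PowerShell instead of the AWS console
PS /home/dwe> Install-AWSToolsModule AWS.Tools.Lambda -CleanUp -Force
PS /home/dwe> Get-LMFunctionList -Region eu-central-1 | Select-Object FunctionName, Runtime, LastModified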

Hope this helps…

The article Publishing a PowerShell script to AWS Lambda appeared first on Blog dbi services.

SQL Server: Generating SQL script using PowerShell and Template file


In this blog post, I will share with you a small PowerShell script I did recently.

I have noticed that my customer performs a very repetitive and time-consuming task almost every day.
New columns are added to tables in their business-critical database, and they need to maintain SQL script files with all the ALTER TABLE statements for each new column.

For every new column, my customer copy-pastes the following SQL script and then changes parts of it.

/***********************************
*
* New column 
*            Schema:       Order
*            Table:        TestTable2     
*            Column:       ColumnName1    
*            
* History    
*            Date:         18/10/2020 
*            User:         Steven Naudet 
*
************************************/

IF NOT EXISTS (
       SELECT * 
       FROM sys.tables AS t 
       JOIN sys.[columns] AS c ON t.[object_id] = c.[object_id]
       JOIN sys.schemas AS s ON s.[schema_id] = t.[schema_id]  
       WHERE 1=1 
       AND s.name = 'Order'  
       AND t.name = 'TestTable2' 
       AND c.name = 'ColumnName1' 
) 
BEGIN 
       PRINT 'Altering table Order.TestTable2 adding column [ColumnName1]' ; 
       ALTER TABLE [Order].TestTable2 
       ADD 
       ColumnName1 INT NOT NULL; 
END 

/***********************************
*
* End New column ColumnName1  
*
************************************/

The highlighted lines are manually edited by my customer every time there’s a new column to be added to the database, which can occur 20 times per week.
I decided to write a PowerShell function to do this task faster so my customer can work on more interesting things instead.

The idea is to use a Template file for the SQL Script. The file is similar to the SSMS templates.
The PowerShell script modifies the template and sends the resulting SQL to the clipboard using Set-Clipboard.
Consecutive calls to the function append the SQL commands one after another in the clipboard. This way my customer can just paste the generated SQL script into his SQL source control tool.

You can see the script in action with the GIF below.

PowerShell Script in action GIF

Here is the script.

function New-AddColumnSQL {

    [CmdletBinding()]
    param (
        [Parameter(Mandatory=$true)][string] $Schema,
        [Parameter(Mandatory=$true)][string] $Table,
        [Parameter(Mandatory=$true)][string] $Column,
        [Parameter(Mandatory=$true)][string] $Type,
        [Parameter(Mandatory=$false)][string] $defaultValue,
        [Parameter(Mandatory=$false)][switch] $isNotNull = $false,
        [Parameter(Mandatory=$false)][string] $User = 'Steven NAUDET'
    )

    $TemplateFile = 'Z:\scripts\TemplateAddColumn.sql'

    $Clipboard = Get-Clipboard
    
    # Append if the clipboard already contains generated SQL, otherwise clear it first
    if ($Clipboard -like '*Altering table*') {
        $returnMessage = 'SQL Script appended to Clipboard'
    } else {
        $returnMessage = 'SQL Script pasted to Clipboard'
        Set-Clipboard -Value $null
    }

    $ColumnDef = $Type

    # NOT NULL
    if($isNotNull) { 
        $ColumnDef = $ColumnDef + ' NOT'
    }
    $ColumnDef = $ColumnDef + ' NULL'

    # DEFAULT value
    if($defaultValue) { 
        $ColumnDef = $ColumnDef + ' DEFAULT ' + $defaultValue
    }

    $SQLscript = Get-Content -Path $TemplateFile
    
    $SQLscript = $SQLscript.Replace('<Date>', (Get-Date -UFormat "%d/%m/%Y"))
    $SQLscript = $SQLscript.Replace('<SchemaName>', $Schema)
    $SQLscript = $SQLscript.Replace('<TableName>', $Table)
    $SQLscript = $SQLscript.Replace('<ColumnName>', $Column)
    $SQLscript = $SQLscript.Replace('<UserName>', $User)
    $SQLscript = $SQLscript.Replace('<ColumnDefinition>', $ColumnDef)

    Set-Clipboard $SQLscript -Append

    return $returnMessage

}
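
As a quick usage example (hypothetical values; the template path is hard-coded in the function above), a call could look like this:

PS C:\> New-AddColumnSQL -Schema 'Order' -Table 'TestTable2' -Column 'ColumnName1' -Type 'INT' -isNotNull -defaultValue '0'
SQL Script pasted to Clipboard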

There’s probably a lot of room for improvement in this code, but the goal of this blog post is to show you how handy PowerShell can be: it can help you save a lot of time.
It took me about an hour to write this code, and I’m sure my customer will save more than that every month.

The article SQL Server: Generating SQL script using PowerShell and Template file appeared first on Blog dbi services.

An Introduction to Pester – Unit Testing and Infrastructure checks in PowerShell


Introduction

If you have never heard of it, Pester is a PowerShell module, written in PowerShell.
It’s a framework for writing and running unit tests, integration tests, and also infrastructure checks, as we will see in a moment.
Pester is used, for example, to test PowerShell Core and Pester itself.

In this blog post, I’ll do a short introduction to Pester with Installation and basic checks examples.

Installation

Pester ships by default with Windows 10 and Windows Server 2016; the installed version is 3.4.
The latest version is available in the PSGallery, currently 5.1.
If you have version 3.4 installed and would like to update it, you will face errors with Update-Module; you need to use the following command to get the latest version:

PS C:\> Install-Module -Name Pester -Force -SkipPublisherCheck
PS C:\> Get-InstalledModule

Version    Name                                Repository           Description
-------    ----                                ----------           -----------
5.1.0      Pester                              PSGallery            Pester provides a framework for...

PowerShell function example

I will now show you a very basic Pester test.
Let’s say I want to write a Pester test for the following PowerShell function.

This is a very basic function that reverses the string characters. This is the output:
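
The original post showed the code and its output as screenshots; a minimal sketch of such a function (the parameter name is an assumption) could be:

function Get-ReverseString {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true)][string] $String
    )
    # Reverse the character array and join it back into a single string
    $chars = $String.ToCharArray()
    [array]::Reverse($chars)
    -join $chars
}

PS C:\pester> Get-ReverseString -String 'dbi services'
secivres ibd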

Now that I have a working function I can start to write the Pester test.

Create a Pester Tests file

The Pester function New-Fixture will create a template file for me, but you could definitely create it yourself.
By convention, Pester test files should end with “.Tests.ps1”.
The Tests file has been created.
I already edited the file and wrote the test.
This is what a Pester test looks like:
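
The original screenshot is not reproduced here, so this is a sketch of what the Tests file could look like (Pester v5 syntax, matching the function above):

# Get-ReverseString.Tests.ps1 (hypothetical reconstruction)
BeforeAll {
    # Dot-source the function under test (this is what New-Fixture generates)
    . $PSScriptRoot/Get-ReverseString.ps1
}

Describe "Get-ReverseString" {
    It "Reverses the characters of a string" {
        Get-ReverseString -String 'dbi services' | Should -Be 'secivres ibd'
    }
}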

Pester basics

Pester is very declarative and easy to read.
You can ignore the first 3 rows in the Tests file; they come from the New-Fixture function and just dot-source the function to test into the PowerShell session.

The main commands with Pester are Describe, It, Context, and Should.

Describe is a block that contains tests. You will often have one Describe block for each function you want to test.
Context blocks are like Describe, they contain It blocks. They are optional and are useful to organize your test code.
The It block is the one that actually contains the test; it should have an expressive phrase describing the expected test outcome.
Finally, the Should command defines the test condition to be met. If the assertion is not met, the test fails and an exception is thrown. I used the -Be parameter; many more are available, like -BeFalse, -BeGreaterOrEqual, -BeLike, -Contain, etc.

In this example the test is simple. I set the expected value that I should get and I compare it to the actual value returned by the function.

Running Pester Tests

Now let’s run the test itself with Invoke-Pester.
Note that running Invoke-Pester without any parameter will run all the “*.Tests.ps1” files in the current location.
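
A run could look like this (hypothetical path; -Output Detailed is the Pester v5 switch for verbose output):

PS C:\pester> Invoke-Pester -Path ./Get-ReverseString.Tests.ps1 -Output Detailed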
So, everything is green. We can see that one test was performed and it Passed.
Now let’s say another developer worked on the Get-ReverseString function and the latest change introduced a bug. The function’s behavior has changed, and the Pester test will now throw a beautiful exception.
What is great is the level of detail (in red) we get when a test fails.

Infrastructure Testing

Pester is often used by sysadmins for infrastructure testing: your environment changes frequently, and you need to be sure that your infrastructure stays aligned with your standards.
Here are a few examples of such tests. This time the test code lives directly inside a Pester block rather than testing a function as I did previously.

Check that my Windows Server Power Plan is set to High Performance.

Describe "Power Plan" {
    $PowerPlan = (Get-CimInstance -ClassName Win32_PowerPlan -Namespace 'root\cimv2\power' | Where-Object IsActive).ElementName
    It "Should be set to High Performance" {
        $PowerPlan | Should -be "High Performance" -Because "This Power Plan increases performance"
    }
}

The best practice for SQL Server disks is to have an allocation unit size of 64 KB, here is the check:

Describe "File Allocation Unit Size" {    
    $BlockSize = (Get-CimInstance -ClassName Win32_Volume | Where-Object DriveLetter -eq $SQLDisk).BlockSize
    It "Should be 64 KB" {
        $BlockSize | Should -Be 65536 -Because "It is recommended to set a File Allocation Unit Size value to 64 KB on partitions where resides SQL Server data or log files"
    }
}

Here I used the dbatools command Get-DbaErrorLogConfig to get the number of files configured for my ErrorLog. My best practice is to have 30 files instead of the default of 6.

Describe "SQL Server Error Log Files" {
    $errorLogCount = (Get-DbaErrorLogConfig -SqlInstance $SQLInstance).LogCount
    It "Should have Number of Log files set to 30" {
        $errorLogCount | Should -Be 30 -Because "Best practices requires 30 logs files to perform daily recycling"
    }
}

When all put together, the output of the tests looks like this:
As you can see, I can easily validate that my SQL Server infrastructure is configured as expected.

Code Coverage

Code coverage is the percentage of lines of code that are exercised by unit tests.
It’s an indicator of how thoroughly your code has been tested. Having 100% coverage doesn’t mean the code is bug-free; it just indicates that all of your code is executed during the tests.

If I add some code to my Get-ReverseString.ps1 file, the -CodeCoverage functionality will tell me exactly what is not covered by tests:
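
In Pester v5, code coverage is driven through the configuration object; a minimal sketch (the paths are assumptions) could be:

# Hypothetical Pester v5 code coverage run
$config = [PesterConfiguration]::Default
$config.Run.Path = './Get-ReverseString.Tests.ps1'
$config.CodeCoverage.Enabled = $true
$config.CodeCoverage.Path = './Get-ReverseString.ps1'
Invoke-Pester -Configuration $config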

Conclusion

This blog post was just to get you started on learning Pester. There are a lot more possibilities, and I might cover some more advanced usages, like TestDrive or Mocking, in a future post.

You can find all the code from this blog post on GitHub.

The article An Introduction to Pester – Unit Testing and Infrastructure checks in PowerShell appeared first on Blog dbi services.

Validate your SQL Server infrastructure with dbachecks


Introduction

In this blog post, I’ll do an introduction to the PowerShell module dbachecks.
dbachecks uses Pester and dbatools to validate your SQL Server infrastructure.
With very minimal configuration you can check that your infrastructure is configured following standard best practices or your own policy.

We will see the following topics:

– Prerequisites for dbachecks Installation
– Introduction to Pester
– Perform a Check
– Manage the Configuration items – Import & Export
– Output
– Power BI dashboard

Prerequisites for dbachecks Installation

The dbachecks module depends on the following modules:

  • dbatools
  • Pester
  • PSFramework

The easiest way to perform the installation is a simple Install-Module. It will get the latest dbachecks version from the PSGallery and install all the required modules up to date.
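
In its simplest form, that is just:

PS C:\> Install-Module dbachecks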

I had many issues with this method: the latest version of PSFramework (1.4.150) did not seem to work with the current dbachecks version, and installing the latest version of Pester (5.0.4) brings issues too.
When running a command I would get the following error:

Unable to find type [Pester.OutputTypes].
At line:219 char:9
+         [Pester.OutputTypes]$Show = 'All'
+         ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Pester.OutputTypes:TypeName) [], RuntimeException
    + FullyQualifiedErrorId : TypeNotFound

To avoid this, prior to installing dbachecks, you should first install PSFramework version 1.1.59.
Pester already ships with recent versions of Windows as version 3.4; if you want a newer one, manually install version 4, since the issues seem to come with version 5.

Set-PSRepository -Name "PSGallery" -InstallationPolicy Trusted

Install-Module PSFramework -RequiredVersion 1.1.59
Install-Module Pester -RequiredVersion 4.10.1 -Force -SkipPublisherCheck
Install-Module dbachecks

Here is what I got working:

Pester

dbachecks relies heavily on Pester. Pester is a framework that brings functions to build unit tests for PowerShell code.
If you don’t know what Pester is, I’d recommend reading my introduction to Pester post here.

dbatools

The checks performed by dbachecks are based on dbatools functions. If you haven’t tried dbatools yet, I’d recommend having a look at the dbatools repository and trying a few commands.

Perform a Check

Now let’s talk about dbachecks. It is basically a set of Pester tests for your SQL Server infrastructure, with code relying heavily on the dbatools module.
Let’s look at the list of available “Checks” from dbachecks with Get-DbcCheck.

As you can see, there are currently 134 checks available, covering a wide range of configurations you might want to verify.

Let’s run a check against a SQL Server instance. To do so, we use the Invoke-DbcCheck command with the check’s UniqueTag and the target instance name.

This one checks the database owner of all user databases on the instance. The default value for this check is configured to “sa”.
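
The invocation, shown as a screenshot in the original post, would look something like this (the instance name is an example):

PS C:\> Invoke-DbcCheck -Check ValidDatabaseOwner -SqlInstance 'localhost'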
My check returned everything green. There’s only one database on this instance and its database owner is “sa”.

Check multiple instances

There are many ways to run checks against multiple instances.
You can define a list of instances in the config with the command below; I’ll come back to configuration elements in a minute.

Set-DbcConfig -Name app.sqlinstance -Value "server1\InstA", "localhost", "server2\instA"

Here I will use a CMS and the dbatools command Get-DbaRegisteredServer to get my list of instances. On the other instance, one of the databases has a non-“sa” database owner.
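
A sketch of that approach (the CMS server name is an assumption):

# Run the check against all instances registered in a CMS
$instances = Get-DbaRegisteredServer -SqlInstance 'MyCMS'
Invoke-DbcCheck -Check ValidDatabaseOwner -SqlInstance $instances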
Maybe this owner is a valid one and I want to have this check succeed. We can modify the check configuration.

Check Configuration elements

All checks can have configuration elements.
To search the configuration elements you can use Get-DbcConfig. Since I want to change the database owner’s name, I can search for all config items with names like “owner”.
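
Assuming wildcard filtering on the name (as provided by the underlying PSFramework), that search could be:

PS C:\> Get-DbcConfig -Name *owner*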

The configuration values are also available with Get-DbcConfigValue.

So now, with Set-DbcConfig I can add a valid database owner to the ValidDatabaseOwner check.
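
The relevant configuration item is policy.validdbowner.name; adding the owner seen above (a value from my lab, adjust to your environment) would look like:

# Allow an additional valid database owner besides 'sa'
Set-DbcConfig -Name policy.validdbowner.name -Value 'sa', 'win10vm4\win10vm4admin'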

Here is the output of the same check run again:

Of course, multiple tests can be run at the same time, for example:
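
For example, combining two checks used later in this post:

PS C:\> Invoke-DbcCheck -Check ValidDatabaseOwner, ErrorLogCount -SqlInstance $instances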

Manage the Configuration items – Import & Export

We have seen how to use Set-DbcConfig to modify your checks’ configuration. You don’t need to change those configurations one by one every time you want to check your infrastructure:
all configuration items can be exported to a JSON file and imported back again.

I can set the configuration items as needed and then do Export-DbcConfig specifying the destination file:

# LastFullBackup - Maximum number of days before Full Backups are considered outdated
Set-DbcConfig -Name policy.backup.fullmaxdays -Value 7

# Percent disk free
Set-DbcConfig -Name policy.diskspace.percentfree -Value 5

# The maximum percentage variance that the last run of a job is allowed over the average for that job
Set-DbcConfig -Name agent.lastjobruntime.percentage -Value 20
# The maximum percentage variance that a currently running job is allowed over the average for that job
Set-DbcConfig -Name agent.longrunningjob.percentage -Value 20

# Maximum job history log size (in rows). The value -1 means disabled
Set-DbcConfig -Name agent.history.maximumhistoryrows -Value 10000

# The maximum number of days to check for failed jobs
Set-DbcConfig -Name agent.failedjob.since -Value 8

# The number of days prior to check for error log issues - default 2
Set-DbcConfig -Name policy.errorlog.warningwindow -Value 3

Export-DbcConfig -Path "$($HOME)\Documents\WindowsPowerShell\MorningCheck-Qual.json"

Here is the output of the Export-DbcConfig:

As you can guess, config files are imported back with Import-DbcConfig.

Import-DbcConfig -Path "$($HOME)\Documents\WindowsPowerShell\MorningCheck-Qual.json"

Output

The Show parameter

The dbachecks console output gives a great level of detail on what is going on, but when you have thousands of checks running you might not want this wall of green text.
To show only the failed checks, you can use the -Show parameter of Invoke-DbcCheck with the value “Fails”.

Invoke-DbcCheck -Check ValidDatabaseOwner -Show Fails

If you want even fewer details, you can use -Show Summary.

XML files

Test results can also be saved to XML files using the OutputFile parameter, like this:
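
The command, shown as a screenshot in the original post, would be along these lines (the file path is an example):

PS C:\> Invoke-DbcCheck -Check ValidDatabaseOwner -OutputFormat NUnitXml -OutputFile 'C:\temp\dbachecks-results.xml'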

Here is an output example:

<?xml version="1.0" encoding="utf-8" standalone="no"?>
<test-results xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="nunit_schema_2.5.xsd" name="Pester" total="2" errors="0" failures="1" not-run="0" inconclusive="0" ignored="0" skipped="0" invalid="0" date="2020-12-14" time="15:29:47">
  <environment clr-version="4.0.30319.42000" user-domain="win10vm4" cwd="C:\Users\win10vm4admin\Documents\WindowsPowerShell\Modules\dbachecks\2.0.7\checks" platform="Microsoft Windows 10 Pro|C:\WINDOWS|\Device\Harddisk0\Partition4" machine-name="win10vm4" nunit-version="2.5.8.0" os-version="10.0.18363" user="win10vm4admin" />
  <culture-info current-culture="en-US" current-uiculture="en-US" />
  <test-suite type="TestFixture" name="Pester" executed="True" result="Failure" success="False" time="0.3166" asserts="0" description="Pester">
    <results>
      <test-suite type="TestFixture" name="C:\Users\win10vm4admin\Documents\WindowsPowerShell\Modules\dbachecks\2.0.7\checks\Database.Tests.ps1" executed="True" result="Failure" success="False" time="0.3166" asserts="0" description="C:\Users\win10vm4admin\Documents\WindowsPowerShell\Modules\dbachecks\2.0.7\checks\Database.Tests.ps1">
        <results>
          <test-suite type="TestFixture" name="Valid Database Owner" executed="True" result="Failure" success="False" time="0.2048" asserts="0" description="Valid Database Owner">
            <results>
              <test-suite type="TestFixture" name="Testing Database Owners on localhost" executed="True" result="Failure" success="False" time="0.1651" asserts="0" description="Testing Database Owners on localhost">
                <results>
                  <test-case description="Database dbi_tools - owner sa should be in this list ( sa ) on win10vm4" name="Valid Database Owner.Testing Database Owners on localhost.Database dbi_tools - owner sa should be in this list ( sa ) on win10vm4" time="0.0022" asserts="0" success="True" result="Success" executed="True" />
                  <test-case description="Database testDB - owner win10vm4\win10vm4admin should be in this list ( sa ) on win10vm4" name="Valid Database Owner.Testing Database Owners on localhost.Database testDB - owner win10vm4\win10vm4admin should be in this list ( sa ) on win10vm4" time="0.0043" asserts="0" success="False" result="Failure" executed="True">
                    <failure>
                      <message>Expected collection sa to contain 'win10vm4\win10vm4admin', because The account that is the database owner is not what was expected, but it was not found.</message>
                      <stack-trace>at &lt;ScriptBlock&gt;, C:\Users\win10vm4admin\Documents\WindowsPowerShell\Modules\dbachecks\2.0.7\checks\Database.Tests.ps1: line 172
172:                         $psitem.Owner | Should -BeIn $TargetOwner -Because "The account that is the database owner is not what was expected"</stack-trace>
                    </failure>
                  </test-case>
                </results>
              </test-suite>
            </results>
          </test-suite>
        </results>
      </test-suite>
    </results>
  </test-suite>
</test-results>

These XML files can be used to automate reporting with the tool of your choice.

Excel export

There’s a way to export the results to Excel. If you want to try it, I’d recommend reading Jess Pomfret’s blog post dbachecks meets ImportExcel.

Power BI dashboard

Checks can be displayed in a beautiful PowerBI dashboard.

The Update-DbcPowerBiDataSource command converts the results and exports files in the format required for the Power BI dashboard, which is launched with the Start-DbcPowerBi command.

Update-DbcPowerBiDataSource can take an “Environment” parameter, which is useful to compare your environments.
Here is an example of how it can be used.

Import-DbcConfig -Path "$($HOME)\Documents\WindowsPowerShell\MorningCheck-Qual.json"
Invoke-DbcCheck -Check ValidDatabaseOwner, ErrorLogCount `
    -Show Summary -Passthru | Update-DbcPowerBiDataSource -Environment 'Qual'

Import-DbcConfig -Path "$($HOME)\Documents\WindowsPowerShell\MorningCheck-Prod-Listener.json"
Invoke-DbcCheck -Check ValidDatabaseOwner, ErrorLogCount `
    -Show Summary -Passthru | Update-DbcPowerBiDataSource -Environment 'Prod'

Start-DbcPowerBi

The dashboard.

Conclusion

From my experience, dbatools usage amongst DBAs has grown a lot recently. Likewise, I think dbachecks will be used more and more by DBAs in the years to come.
It’s easy to use and can save you a lot of time on your daily/weekly SQL Server checks.

This blog post was just to get you started with dbachecks. Do not hesitate to comment if you have any questions.

The article Validate your SQL Server infrastructure with dbachecks appeared first on Blog dbi services.
