
PowerShell : Retry logic in Scripts

One of my projects required me to copy a CSV file (an important step) to a VM running on Server 2012 R2.
I found this excellent tip by Ravi on using the Copy-VMFile cmdlet in Server 2012 R2 Hyper-V. To use this cmdlet, I had to enable the "Guest Service Interface" component in the Integration Services (below is what the documentation says about the service).

This new component in the Integration Services allows copying files to a running VM without any network connection (How cool is that?).

The tip mentioned earlier talks about how to enable the component using Enable-VMIntegrationService, but there is a delay between enabling the component and being able to successfully use the Copy-VMFile cmdlet.

So how do I make sure that the service is running before the cmdlet is issued, or keep retrying the cmdlet until it succeeds?
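One way to tackle the first part of that question is to poll the integration service until it reports as healthy, rather than guessing. Below is a minimal sketch of that idea (not from the tip itself); the VM name, the timeout values, and the check against PrimaryStatusDescription are assumptions for illustration:

# Sketch: poll the Guest Service Interface until it reports as healthy (or time out),
# instead of sleeping for a fixed interval. Checking PrimaryStatusDescription for 'OK'
# is an assumption about how a healthy integration component reports itself.
$VMName  = 'TestVM'        # hypothetical VM name
$timeout = 120             # seconds
$elapsed = 0
do {
    $gsi = Get-VMIntegrationService -VMName $VMName -Name 'Guest Service Interface'
    if ($gsi.Enabled -and $gsi.PrimaryStatusDescription -eq 'OK') { break }
    Start-Sleep -Seconds 5
    $elapsed += 5
} while ($elapsed -lt $timeout)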





The simplest way would be to use Start-Sleep to induce a delay and guess that the service will be running by the time the cmdlet executes, as done in the function definition below:

Function Copy-ImportantFileToVM {
    [CmdletBinding()]
    param($VMName)

        $VM = Get-VM -Name $VMName
        #Check if Guest Integration Service is enabled
        $GuestService = $VM.VMIntegrationService.Where({$_.Name -eq 'Guest Service Interface'})
        if (-not $GuestService.Enabled) {
            #Enable the GSI
            $GuestServiceStatus = $VM | Get-VMIntegrationService -Name "Guest Service Interface" | Enable-VMIntegrationService -Passthru
            if (-not $GuestServiceStatus.Enabled) {
                throw "Couldn't enable Guest Service Interface"
            }
        }    
        # Induce sleep in the script for 120 seconds just to be sure
        Start-Sleep -Seconds 120
        # Critical Step -> Copy test CSV to VM
        if (Test-Path -Path "$PSScriptRoot\test.csv") {
            try {
                $CopyFileHash = @{
                    Name=$VMName;
                    SourcePath="$PSScriptRoot\test.csv";
                    DestinationPath='C:\temp\test.csv';
                    FileSource='Host';
                    CreateFullPath=$true;
                    Force=$true;
                    Verbose=$true;
                    ErrorAction='Stop';
                }
                Copy-VMFile @CopyFileHash
            }
            catch {
                # Put error handling here - maybe log it
                $PSCmdlet.ThrowTerminatingError($PSItem)
            }
        } # end if
}

That brings up another question: what if the delay is not enough, or what if it is too much?

Similarly, a more practical use case is the Azure cmdlets, which make REST API calls behind the scenes: what if the network fluctuates while one of the REST endpoints is being called and the cmdlet fails at a critical step?

The bottom line is that I am looking for retry logic within my scripts, so that my code retries an important step a few times before it gives up.

While researching the topic, I found this article by Pawel and this article by Alex on retry logic in PowerShell. Using these as a reference, I wrote the function below, named Invoke-ScriptBlockWithRetry:


function Invoke-ScriptBlockWithRetry
{
<#
.Synopsis
   Invokes a script block with resiliency.
.DESCRIPTION
   The function takes a script block as a mandatory argument and tries to run it a certain number of times (argument to -MaxRetries).
   It delays execution between subsequent retries by 10 seconds (default); a custom value can be passed to the -RetryDelay parameter.
.EXAMPLE
   First create a script block with -ErrorAction set to Stop and then pass it to the function
   PS> $CopyLambda = {Copy-Item -Path \\fileserver\Info\test.csv -Destination C:\Temp -ErrorAction Stop}
   PS> Invoke-ScriptBlockWithRetry -ScriptBlock $CopyLambda -MaxRetries 5 -Verbose
.EXAMPLE
   Script blocks have access to the current scope variables, so if you set a variable in the current scope, you can use that within the script block
   PS> $name = 'notepad'
   PS> Invoke-ScriptBlockWithRetry -ScriptBlock {Get-Process -Name $name -EA Stop} -MaxRetries 5 -Verbose

.NOTES
   Credits
   Inspired by -
        1. http://www.pabich.eu/2010/06/generic-retry-logic-in-powershell.html
        2. http://www.alexbevi.com/blog/2015/02/06/block-retry-using-powershell/
#>

    [CmdletBinding()]
    [OutputType([PSObject])]
    Param
    (
        # The script block to invoke; set -ErrorAction Stop on commands that should trigger a retry
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [System.Management.Automation.ScriptBlock]
        $ScriptBlock,

        # Number of retries. Default is 10.
        [Parameter(Position=1)]
        [ValidateNotNullOrEmpty()]
        [int]$MaxRetries=10,

        # Number of seconds delay in retrying. Default is 10 seconds.
        [Parameter(Position=2)]
        [ValidateNotNullOrEmpty()]
        [int]$RetryDelay=10
    )

    Begin
    {
        Write-Verbose -Message "[BEGIN] Starting the function"
        $currentRetry = 1
        $Success = $False
    }
    Process
    {
        do {
           try
            {
                Write-Verbose -Message "Running the passed script block -> $($ScriptBlock)"
                $result = & $ScriptBlock # invoke the script block
                $success = $true
                Write-Verbose -Message "Script block ran successfully -> $($ScriptBlock)"
                return $result
            }
            catch
            {
                $currentRetry = $currentRetry + 1
                Write-Error -Message "Failed to execute -> $($ScriptBlock).`n Error -> $($_.Exception)"  # Write non-terminating error for allowed retries
    
                if ($currentRetry -gt $MaxRetries) {         
                    # If the current try count has exceeded maximum retries, throw a terminating error and come out. In place to avoid an infinite loop 
                    Write-Warning -Message "Could not execute -> $($ScriptBlock).`n Error -> $($_.Exception)"
                    $PSCmdlet.ThrowTerminatingError($PSitem) # Raise the exception back for caller. This is a terminating error as the retries have exceeded MaxRetries allowed.
                }
                else {
                    Write-Verbose -Message "Waiting $RetryDelay second(s) before attempting again"
                    Start-Sleep -Seconds $RetryDelay
                }
            }
        } while(-not $Success) # Do until you succeed
    }
    End
    {
        Write-Verbose -Message "[END] Ending the function"
    }
}

Now, the trick to using this function in your scripts is to pass it a script block with -ErrorAction set to Stop for the steps that you think might fail and that you want to be retried.
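For instance, reusing the file-share copy from the first help example above, a critical step can be wrapped like this (the retry and delay values are just placeholders):

# -ErrorAction Stop turns any failure into a terminating error,
# so the retry function's catch block can see it and try again.
$criticalStep = { Copy-Item -Path \\fileserver\Info\test.csv -Destination C:\Temp -ErrorAction Stop }
Invoke-ScriptBlockWithRetry -ScriptBlock $criticalStep -MaxRetries 3 -RetryDelay 15 -Verbose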

Let’s rewrite our Copy-ImportantFileToVM function using Invoke-ScriptBlockWithRetry:


Function Copy-ImportantFileToVM {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [String]$VMName
        )

        $VM = Get-VM -Name $VMName
        #Check if Guest integration Service is enabled
        $GuestService = $VM.VMIntegrationService.Where({$_.Name -eq 'Guest Service Interface'})
        if (-not $GuestService.Enabled) {
            #Enable the Guest Integration Service
            $GuestServiceStatus = $VM | Get-VMIntegrationService -Name "Guest Service Interface" | Enable-VMIntegrationService -Passthru
            if (-not $GuestServiceStatus.Enabled) {
                throw "Couldn't enable Guest Service Interface"
            }
        }

        $CopyFileHash = @{
                            Name=$VMName;
                            SourcePath="$PSScriptRoot\test.csv";
                            DestinationPath='C:\temp\test.csv';
                            FileSource='Host';
                            CreateFullPath=$true;
                            Force = $true;
                            Verbose=$true;
                            ErrorAction='Stop';
                        }
        # Copy test CSV to VM
        if (Test-Path -Path "$PSScriptRoot\test.csv") {
            # Critical step to copy the CSV, I want it to be retried
            Invoke-ScriptBlockWithRetry -ScriptBlock {
                Copy-VMFile @CopyFileHash
                # Non-critical step, just an example -> I don't care if notepad is running
                Get-Process -Name notepad -ErrorAction SilentlyContinue
            }
        } # end if

}



If you have a good eye, you will have noticed the non-critical step placed inside the script block. I put it there to show that it is possible to have steps within your script block that you don't care about if they throw an exception (-ErrorAction SilentlyContinue suppresses the error messages for the non-critical step).

Note that I can also pass the maximum number of retries along with the wait interval (in seconds) between retries. Now you can get creative and extend this as per your needs in various scenarios.
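For example, the critical copy in the function above could be tuned like this (the numbers are just placeholders; pick values that match how long the Guest Service Interface typically takes to come up):

# Retry the copy up to 12 times, waiting 10 seconds between attempts (roughly 2 minutes in total)
Invoke-ScriptBlockWithRetry -ScriptBlock { Copy-VMFile @CopyFileHash } -MaxRetries 12 -RetryDelay 10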

Have fun exploring!
