Exchange servers can accumulate a lot of IIS log files over time. Some administrators configure IIS to store logs on a different disk to avoid problems, while others just wait for free disk space alerts and manually remove old logs from time to time.

I found a number of PowerShell scripts online for automating the cleanup of IIS log files, but none were an exact match for what I wanted. I had a few objectives for this script:

  • No external dependencies (many scripts rely on command line versions of zip utilities like 7-Zip)
  • Must compress log files into monthly archive zip files, not just delete them
  • Must have the ability to store the zip files in a central archive location

So I wrote IISLogsCleanup.ps1, which I am making available for download here.

Download the script from the TechNet Script Gallery or Github.

How to Run IISLogsCleanup.ps1

Please test the script on a non-production server first, or at least make sure you have backed up your IIS log files before trying this script for the first time.

The script takes two parameters:

  • Logpath – this is a mandatory parameter to specify the path to the IIS logs you want to clean up, such as "D:\IIS Logs\W3SVC1"
  • ArchivePath – this is an optional parameter to specify the path to the central archive location, such as "\\nas01\archives\iislogs"

When you run IISLogsCleanup.ps1 it performs the following:

  • Calculates the first day of the previous month (so there will always be at least 1 month of retained logs)
  • Zips up log files from before the first day of the previous month into zip files per month
  • Verifies the results of the zip action and removes the log files if it is safe to do so
  • Optionally, moves the zip file to the central archive location
  • Writes a log file of progress and actions taken
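
The cutoff in the first bullet can be sketched in a few lines of PowerShell (an illustrative sketch, not the script's actual code):

```powershell
# Illustrative sketch of the cutoff logic (not the exact code from IISLogsCleanup.ps1)
$today = (Get-Date).Date                         # midnight today, ignores time of day
$firstOfThisMonth = $today.AddDays(1 - $today.Day)
$firstOfPreviousMonth = $firstOfThisMonth.AddMonths(-1)

# Logs last written before this date are candidates for zipping, so at least
# one full month of logs is always retained on disk
Write-Host "Archiving logs older than $firstOfPreviousMonth"
```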

Examples:

.\IISLogsCleanup.ps1 -Logpath "D:\IIS Logs\W3SVC1"

.\IISLogsCleanup.ps1 -Logpath "D:\IIS Logs\W3SVC1" -ArchivePath "\\nas01\archives\iislogs"


Scheduling IISLogsCleanup.ps1

To run the script as a scheduled task use the following task settings (replace server names and file paths as necessary):

  • Run whether user is logged on or not
  • Triggers: I recommend the first day of each month
  • Action: Start a program
    • Program: powershell.exe
    • Arguments: -command "C:\Scripts\IISLogsCleanup.ps1 -LogPath 'C:\inetpub\logs\LogFiles\W3SVC1' -ArchivePath '\\ho-mgt\iislogbackups'"

To run the task with a least-privilege service account, the account needs:

  • Rights to "Log on as a batch job" for the local server
  • Read/write access to the IIS logs directory
  • Write access to the archive location
  • Read/write/execute to the location where the script is running from
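
The NTFS permissions in that list can be granted with icacls; for example (the account name and paths here are hypothetical, so adjust them for your environment):

```powershell
# Grant a hypothetical service account modify rights on the IIS logs directory,
# write on the archive share, and read/execute on the script folder.
# (OI)(CI) = inherit to files and subfolders; M = modify, W = write, RX = read & execute
icacls "C:\inetpub\logs\LogFiles\W3SVC1" /grant "CONTOSO\svc-iislogs:(OI)(CI)M"
icacls "\\nas01\archives\iislogs" /grant "CONTOSO\svc-iislogs:(OI)(CI)W"
icacls "C:\Scripts" /grant "CONTOSO\svc-iislogs:(OI)(CI)RX"
```

The "Log on as a batch job" right is assigned separately, under Local Security Policy > User Rights Assignment.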

Download the script from the TechNet Script Gallery or Github.

As always if you have any questions or feedback please leave a comment below.

Updated 8/8/2015 – thanks to Rob and Alain in the comments for the suggestions for regional date format issues and zip file locking issues.

About the Author

Paul Cunningham

Paul is a former Microsoft MVP for Office Apps and Services. He works as a consultant, writer, and trainer specializing in Office 365 and Exchange Server. Paul no longer writes for Practical365.com.

Comments

  1. Noe L

    Hi friends.

    If I delete these files, what happens?

    As I understand it, they record the connections made by the devices that have institutional mail configured…

    Don't they serve as a tracking log, that is, for looking up past connections as history?

    If they are deleted, that history would be gone, right?

    For example, I could delete everything older than 3 months and keep the last 3 months; in theory no service depends on them.
    Your opinion would be appreciated.

  2. Marco Casella

    Great script, works like a charm… but not on Windows Server Core 🙁

    I think this part:

    $shellApplication = new-object -com shell.application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    returns a null value.

    You cannot call a method on a null-valued expression.
    At C:\edok\IISLogsCleanup.ps1:280 char:9
    + $zipPackage.CopyHere($fn,16)
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull

    Some hint on how to circumvent the problem?

    great thanks in advance
    Marco

  3. Lucien

    What if I just want to use a static logpath instead of the parameter asking for it. I want to use it for automatic scripting every month instead of doing it by hand.

  5. Robert

    Hello guys. I've run into some problems with this PS script. First of all, Paul, thank you so much for providing us this genius cleanup script :).

    The script is working fine, but on some log files I get the error message and step immediately into the ELSE statement: "Zipped file count does not match log file count, not safe to delete log files". I tried to debug, and as far as I can tell the variable $zipfiles does not receive any input, so its count is 0. Increasing the sleep time does not solve the problem. So in the if statement, ($zippedcount -eq $($zipfiles.Count)) compares 1 to 0 and I fall into the ELSE.

    To make things more confusing, if I run the script locally on my desktop everything is fine and all the logs are compressed and deleted as expected, but once the script is deployed on a 2012 R2 or 2008 server, some of the logs (the same logs that I copied locally for testing purposes) are zipped but not removed due to the count mismatch. I assumed it was a security policy, but I tried everything: -NoProfile, -ExecutionPolicy Bypass, redirecting to the 32-bit PowerShell, granting the SYSTEM user the local policy right to take ownership of files. My user is in the local Administrators group, so it has full control rights on the server. Any help would be very appreciated.

    1. Rob

      Well, I was able to reproduce the problem. If there is only one log in a month, i.e. u_ex190125, then I get the error, but if there are > 1 logs in a month it works. How could I work around the issue?

    2. Mirco Palandri

      Hi guys,
      you get the "Zipped file count does not match log file count, not safe to delete log files" error if you have a single file in the zip.

      That happens because, when you assign the filtered $hashtable value (which contains a single DictionaryEntry object) to $zipfiles, $zipfiles becomes a DictionaryEntry object instead of an array; then when you try to Count a DictionaryEntry the result is: nothing 🙂

      RESOLUTION: add an [array] before the $zipfiles variable declaration:

      [array]$zipfiles = $hashtable | Where {$_.Value -eq "$($date.Name)"}

      I hope it helps

      Cheers,

      Mirco
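
      Mirco's [array] cast can be demonstrated in isolation (a minimal sketch, with a made-up log file name):

```powershell
# Build a one-entry hashtable like the script does for a month with a single log
$hashtable = @{ 'u_ex190125.log' = '2019-01' }.GetEnumerator() | Sort Value

# Without [array], a single match comes back as a bare DictionaryEntry; on older
# PowerShell versions it has no usable Count, so the safety comparison fails.
[array]$zipfiles = $hashtable | Where {$_.Value -eq '2019-01'}

$zipfiles.Count   # always 1 here, so the zipped-count comparison behaves the
                  # same for one file as for many
```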

  6. Matt

    I've got an odd issue going on. What I've found is the script will run, and if it finds multiple months' worth of logs it will zip them by month and delete as expected. But when it gets to the previous month, it will add the first of that month to a zip and delete that log, leaving the rest of the month's logs in the directory. Then the next time the script runs it fails, because it's missing one day and the logs don't add up to what the script is expecting.

    I did try changing $_.CreationTime to $_.LastWriteTime on line 233, thinking it was an issue with how W3C handles time, but that doesn't seem to have changed anything.

    1. Matt

      Sorry, line 223 not 233.

    2. matt

      Turns out it was because of how W3C handles time. Get-Date uses the exact time the script is run. Depending on when the log files were created, this was causing my issue. I ended up adding .Date to the end of line 113, which forces Get-Date to use 12 AM instead of the local clock time. This, in addition to changing $_.CreationTime to $_.LastWriteTime, seems to have solved my issues.
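
      The Get-Date behaviour matt describes is easy to see in a console (illustrative):

```powershell
# Get-Date returns the current date AND time-of-day, so any cutoff derived from
# it inherits the moment the scheduled task happened to fire
(Get-Date)        # e.g. 04/10/2015 08:56:59
(Get-Date).Date   # 04/10/2015 00:00:00  (matt's fix: anchor the cutoff at midnight)

# A log file written at, say, 07:30 on the cutoff day falls on different sides
# of these two cutoffs, which is why one day's log was left behind
```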

  7. William

    Hi Paul,

    I've run into a problem where, when the zip file reaches about 5GB during compression, a pop-up window tells me the zip file is damaged. I tried three times on two different servers as administrator, all ending with the same result.

    I've read about four years of comments on this post and didn't find a similar situation. Do you have any idea about this?

    With many thanks.

    1. Paul Cunningham

      With that much data it may be worth customizing the script to zip the logs up per day instead of per month.

  8. Ashfaq

    Hi Paul,

    The script works great, but I noticed one thing: every monthly archive has one file from the previous month.

    For example, the archive for October 2017 is missing the file for 31 October and has the file for 30 September.

    1. Paul Cunningham

      I believe that is due to the script looking at the last write time for the file, not the time that it was created. A file created on 30th September can have a last write time of 1st October.
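
      The two timestamps can be compared on any log file (a sketch using the standard FileInfo properties; the path is hypothetical):

```powershell
# CreationTime is when the file appeared; LastWriteTime is the final write.
# An IIS log created on 30 September but written to until rollover at midnight
# ends up with a LastWriteTime of 1 October.
Get-ChildItem 'C:\inetpub\logs\LogFiles\W3SVC1\u_ex170930.log' |
    Select-Object Name, CreationTime, LastWriteTime
```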

  9. Imi

    Hi

    I have an issue with the scheduled task.
    The problem is that the task is shown as finished in the Task Scheduler console, but meanwhile the powershell process is still running in Task Manager and the zip file is created, but hasn't been copied to the target location.

    When I kill the powershell process, I can copy the zip file manually, and it contains the IIS logs from the right time period.
    I don't know what the problem could be. The scheduled task runs with a domain account that is a local administrator, and is configured as hidden with highest privileges. The created zip file is around 4GB.

    1. Paul Cunningham

      Could be permissions. Make sure the service account has permissions to write to wherever you’re trying to send the Zip file. Or run it with a more powerful account to test it.

  10. Kris

    Oh, the other thing is that I changed it to include all folders underneath the log path (using the Recurse parameter). Since subfolders could have files with the same name (the log file names are based on the date), I changed the line that adds the log files to the zip file to include the full path to the log file.

    i.e. I changed this line (around line number 280):

    $zipPackage.CopyHere($fn,16)

    to

    $zipPackage.CopyHere($fn.FullName,16)

  11. Kris

    Here’s how to manually specify a date (from which you want to remove the logs created before):

    $mydate = [datetime]::ParseExact("01/11/17", "dd/MM/yy", $null)

    $logstoremove = Get-ChildItem -Path "$($Logpath)" -Include *.log -Recurse | Where {$_.CreationTime -lt $mydate -and $_.PSIsContainer -eq $false}

  12. vilas

    Hi Paul

    Can I have an option to set the dates from and until which to archive the logs?

    I want to archive the logs from the last 15 days.

  13. G.Kracht

    Hey Paul,

    I haven't read all the comments above me, so if my question has already been asked you can point me to it.

    I need to scan the subfolders of my logs folder.
    The folder tree looks like: LOGS contains http and service; service contains 2-6 services, each containing logs.

    Can you help me, or do you have any ideas?

    Greetings

  14. Priyam Moitra

    Hi Paul,

    Can I pass input through a txt file:

    $computername = C:\ServerList.Txt ?

    Will it work?

    Thanks
    Priyam

  15. Priyam Moitra

    HI Paul,

    thanks for your script .

    Could you please help me to add multiple servers and run the script from one dedicated server.

    For example, we want to execute the command from Server 1, and it will clean for Server 1, Server 2 …. Server n.

    Thanks
    Priyam

    1. Paul Cunningham

      The script is designed to run on the local server, not run against remote servers. What you’re asking for is possible, but it would be more than a simple task to rewrite the script to work like that. You’re welcome to take the script and customize it for your own needs.

  16. Vaibhav

    Hi Paul,

    I would like to run this script to clear the previous day's logs (basically 1-day-old logs instead of 1-month-old). Please let me know what changes I need to make in the script.

    Many thanks in advance.

    Regards,
    Vaibhav

  17. selvaraj manoharan

    Hi,
    I need this script to zip IIS log files on multiple servers.

    Regards,
    Selvaraj. M | India

  18. Aron Boner

    Hi Paul
    Great script, but I have a question:
    Why does it create a new file for the next month and place 2 files in it (the last day of the month before and the first day of last month)?

    I created a scheduled task that runs on the first day of each month.
    For example, the run on 05/01/2016 creates the ZIP mailservername-W3SVC2-2016-03.zip with all the files, and then it creates a new file mailservername-W3SVC2-2016-04.zip and puts the two files in it. When the script starts on 06/01/2016, the failure is that the file mailservername-W3SVC2-2016-04.zip already exists and it can't move…

    Thank you for your help
    Aron

    1. Paul Cunningham

      The script looks at the last write time of the file. I haven’t tested this, but it’s possible your IIS log file rollover settings are set in a way that causes what you’re seeing there.

  19. Paul

    Doh! Scratch that I’ve just read the article properly and can see you specify the logfiles folder in the command.

    Thanks,
    Paul

  20. Paul

    Hi Paul,

    Does the script clean up both W3SVC1 and W3SVC2, as we have both folders under the logfiles folder?

    Thanks,
    Paul.

  21. Ari

    excellent Thank you

  22. Bezu

    Hi Paul,

    Thanks for this wonderful script. I just need a little insight: which variable do I have to change to detect and compress the logs older than 15 days instead of the previous month?

    1. Paul Cunningham

      There’s no variable in the script for that. You’d need to do some re-writing of the logic around how it selects log files to archive.

  23. Bill

    The script is working for me except the -ArchivePath. Nothing is placed in the folder.
    These are my arguments:
    -command D:\workarea\Scripts\IISLogsCleanup.ps1 -LogPath "C:\inetpub\logs\LogFiles\W3SVC3" -ArchivePath "\\yoweb4\Logs"

    I have a share called Logs.

    Thanks.
    Bill

    1. Bill

      I am running this on 2012 R2 and it is configured for Windows Server 2012 R2.

      1. Paul Cunningham

        The script writes a log file, have you checked it for any errors or clues as to why the Zip files aren’t being created or moved to the archive path?

        1. Bill

          The zip files are created but not in the Archive Folder. I will do more checking.

          Thanks.
          Bill

        2. Bill

          I found the problem. It was a permission problem to the archive location.

          Thanks.
          Bill

  24. Ryan

    Hey guys – I have been playing with this script and it's great!
    However, the only issue I face is that when I run the script manually it works great,

    but when I run it via Task Scheduler it never finds anything. From the output below, it seems to be looking for logs older than September but doesn't find any,

    when the current month is October.

    10/04/2015 08:56:59 =====================================
    10/04/2015 08:56:59 IIS Log File Cleanup Script
    10/04/2015 08:56:59
    10/04/2015 08:56:59
    10/04/2015 08:56:59 Current Month: 10/15
    10/04/2015 08:56:59 Previous Month: 09/15
    10/04/2015 08:56:59 First Day of Previous Month: 09/01/15
    10/04/2015 08:56:59 Found 0 logs earlier than 09/01/15
    10/04/2015 08:56:59 Finished

    1. Ryan

      Found the issue- my fault

      Forgot to enable the "Run with highest privileges" option in Task Scheduler

  25. Kiran

    How can I customize this script so it can run for the current month (first day of the month up to midnight of the previous day)?

  26. Jeeva

    My bad – got it.. thank you

  27. Jeeva

    I'm not a PowerShell guy, but I tried to set up the script for the environment and I got the below error:

    cmdlet logzip.ps1 at command pipeline position 1
    Supply values for the following parameters:
    Logpath: "D:\inetpub\logs\test"
    WARNING: Log path "D:\inetpub\logs\test" not found
    PS D:\DeploymentBackup>

  28. Stephen

    I wish there was a way to further automate this, such as:

    Making changes to this script to do all W3SVCxx folders with logs in them over 30 days old.

    What if a new website is created? I don't want to have to manually hunt for new folders every time a new website is created… just to manage my IIS logs.

    Does this script have any built-in logic to do all .log files in any folder under inetpub\logs\LogFiles named W3SVCxx? So that it encompasses newly created websites.

      1. Sean

        Hi Paul,

        Your script is amazing! The archive routine is beautiful — it’s exactly what I’ve been looking for. However, I’m also struggling to get it to recurse through all of the directories in “C:\inetpub\logs\LogFiles\*” (all of the W3SVC* directories).

        Any information you can pass along would be greatly appreciated.

        Thanks,

        Sean Snow

  29. Paul

    Hi Paul

    Thanks for sharing this script.

    I have no problem executing the script and it does exactly what I was looking for. I have created it as a scheduled task.

    The issue I have is when running the powershell script (using 2012 R2).
    I have created the task as follows but nothing is happening:

    Is it something very basic that I am missing?

    Action: Start a program

    Program: powershell.exe
    Arguments: -command "C:\Scripts\IISLogsCleanup.ps1 -LogPath 'C:\inetpub\logs\LogFiles\W3SVC1' -ArchivePath '\\mysharedpath\backup'"

    1. Paul Cunningham

      At the end of the blog post I wrote a series of points that need to be checked to ensure the script will run correctly as a scheduled task. Have you checked all of those?

      The script also writes a log file, so you can look at that to see if it shows any clues.

    2. Joseph

      I had the same issue as you when trying to run the script from a scheduled task on a Win 2008 R2 machine. The task would run and complete successfully within a few seconds, but the script would never execute. I had to remove the single quotes and use double quotes around the path parameters only, as below.

      This worked for me:

      Program: powershell.exe
      Arguments: -command C:\workarea\Scripts\IISLogsCleanup.ps1 -LogPath "E:\iis-logs\W3SVC3" -ArchivePath "\\mysharedpath\IIS-LOGS"

      I hope this helps anyone else having the same issue running through Windows Task Scheduler.

      The script is awesome, btw!!

  30. Michael McGuinness

    I had also been going crazy as to why it would not find my files; I had hundreds of them.

    I changed line 205 from "Where {$_.CreationTime" to "Where {$_.LastWriteTime" and now it works perfectly.

    Not sure if this is an issue with different versions of powershell, I am running version 4.

    1. Paul Cunningham

      What happens if you just open PowerShell and run the Get-Childitem command manually? Does it return results when you filter on CreationTime? If you Get-ChildItem one specific file and inspect the attributes does anything stand out different?

      1. Michael McGuinness

        I see your point Paul, here is what I can get:

        CreationTime : 16/07/2015 10:00:00 AM
        CreationTimeUtc : 16/07/2015 12:00:00 AM
        LastAccessTime : 24/08/2015 10:18:17 AM
        LastAccessTimeUtc : 24/08/2015 12:18:17 AM
        LastWriteTime : 17/07/2015 10:00:00 AM
        LastWriteTimeUtc : 17/07/2015 12:00:00 AM

        Nothing is standing out for me, however I note that this is one of the files that should not be zipped based on its datetime stamp; I can't be sure what my old files would have been.

        Overall I can now say that I have put this script on a few servers and they are all working really well using LastWriteTime, so maybe it is just more consistent with other files.

        1. Michael McGuinness

          Yep same file, I would say that since I made those changes everything works very well 🙂 So I am happy!

  31. Michael McGuinness

    Hi Paul,

    Great job on this script. I came across it via the Microsoft site and have got it to work, but found a few slight issues:

    1) Line 178, I think there is a typo: it currently shows Write-Logfile " $no", but the variable being called should be $now. I changed it and now it shows the date in the log.

    2) I had massive issues understanding the date handling. It was so weird that on all my AU machines (servers and workstations) the date shown on the Write-Host lines was in the US format (not sure if this matters, but it would be good to show the correct format in the logs).

    Output of log entries:

    08/25/2015 13:02:52 =====================================
    08/25/2015 13:02:52 IIS Log File Cleanup Script
    08/25/2015 13:02:52
    08/25/2015 13:02:52
    08/25/2015 13:02:52 Current Month: 8
    08/25/2015 13:02:52 Previous Month: 7
    08/25/2015 13:02:52 First Day of Previous Month: 07/01/2015 13:02:52
    08/25/2015 13:02:52 Found 31 logs earlier than 07/01/2015 13:02:52

  32. Tony

    If you don't like scripting and prefer a GUI, check out iislogs.com. You also get monitoring; it auto-configures, zips and moves your log files to another location. $30 well spent.

    1. Bill

      That is $30 per server.

      Bill

  33. Paul Cunningham

    I have moved the script to TechNet and Github and updated the download links, so anyone having trouble downloading should be able to get it now.

    Rob and Alain, thanks for your feedback and fixes on those date/time and zip file lock issues. I have added your fixes into the script and credited you. If you’ve got a blog or Twitter handle you’d like included in the credit please let me know.

  34. Bill

    I am getting a Not Found 404 error when trying to download the script.

  35. Demetrius

    The download link is not working. 🙁

  36. Bob Endicott

    I have to say this script is awesome. I wasn't necessarily looking to clean up Exchange logs but will be using it for this as well. I had to combine Alain's IsFileLocked function, Rob's fix for US dates and Paul's script. I added email notification with the log attached so I do not have to check the servers. This script is a lifesaver and will save me hours of time.

    1. Paul Cunningham

      Thanks for the feedback. I really should get around to adding Alain and Rob’s fixes to the script 🙂

  37. Atif Mushtaq

    Can anyone please guide me or change the script so that I can keep just the current day's IIS log in the same path and archive all the rest?

    thank you

  38. Alex

    We are having issues accessing the zip folders with different users that are all local admins and have full access to the parent folders. The inheritance of the permissions does not work correctly.
    Any idea how to avoid this?

  39. Thor

    And if you only want to keep the logs for the current month, what change should I make?
    Is this configuration valid for Spain (dd/mm/yyyy)?

  40. Martijn Westera

    Hi Paul,

    I got an error when testing: powershell.exe -command "D:\Scripts\IISLogsCleanup.ps1 -LogPath ‘C:\inetpub\logs\LogFiles\W3SVC1´"

    The string is missing the terminator: ‘.

    Solution:
    powershell.exe -command "D:\Scripts\IISLogsCleanup.ps1 -LogPath "C:\inetpub\logs\LogFiles\W3SVC1""

    1. Martijn Westera

      I think it was a syntax error on my side.
      "D:\Scripts\IISLogsCleanup.ps1 -LogPath 'C:\inetpub\logs\LogFiles\W3SVC1'" is working now

  41. Eddie

    Hoi Paul,

    Can this script also be used to clean up the "C:\Program Files\Microsoft\Exchange Server\V14\Logging" folders? Like, EWS, Addressbook Service, RPC Client Access etc etc?
    Or is there a reason one cannot clean up these folders or should never clean them up?

  42. jackr

    It is pathetic that MS does not implement automatic log file clean up.

    As usual, MS doing half a job when making software.

  43. Hoang NGUYEN

    I have IIS log files of many sizes; some are small but others are big.
    For the small ones, the zip succeeds.
    For the big ones, it fails to zip with an error:
    cannot read or access

    After reviewing some comments, I'm looking at the $sleepinterval parameter.
    What do I need to set $sleepinterval to for logs of around 800MB per file?

  44. Richard

    This script works wonders… I was wondering if there is a way to get the log file to be named after the log file path's last folder name.

    So something like IISLogsCleanup_W3svc1.log

    On some of my servers I have 4 different log paths and would like a log file for each, but still just use one script. I have set up 4 different scheduled tasks and am happy with that.

    Hope that makes sense and someone could help!

    Thanks

    1. Richard

      I sorted this by adding the below code to the script:

      $CleanupLog = "c:\path to where I want log files"

      $pos = $Logpath.lastIndexOf("\")

      $leftPart = $Logpath.Substring(0, $pos)
      $rightPart = $Logpath.Substring($pos, $Logpath.Length - $pos)

      #$myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
      $output = $CleanupLog + $rightPart + "_IISLogsCleanup.log"

  45. Clint

    Thanks for the script Paul! Had to make the US date mod, but otherwise just what I needed. If you do any updates, I’d suggest ISO yyyy-mm-dd format.

  46. wasim

    Hi Paul, this script works fine.
    I tried to modify it to work for files less than a month old, but it gets messed up.
    Can you guide me?

  47. David Taig

    I'm finding I need to update $sleepinterval to 75 for logs of around 85MB a day, on a server running 6 vCPUs and 32 GB RAM.

    $sleepinterval = 75

    1. Paul Cunningham

      Alain’s comment above has an IsFileLocked check that would probably work better for everyone than guesstimating sleep intervals.

  48. Alain Arnould

    Hi Paul,

    Thank you very much for sharing your script. It helps me a lot. I made some minor modifications (parameters) and, instead of having a "sleep" in the script, I modified it to have an "IsFileLocked" check on the use of the zip file.
    Here is the first version…

    ==========================================

    [CmdletBinding()]
    param (
    [Parameter( Mandatory=$true)]
    [string]$NbOfMonths,

    [Parameter( Mandatory=$true)]
    [string]$Logpath,

    [Parameter( Mandatory=$true)]
    [string]$FileExt,

    [Parameter( Mandatory=$false)]
    [string]$ArchivePath
    )

    #————————————————-
    # Variables
    #————————————————-

    $sleepinterval = 1 # used with loop and IsFileLocked function

    $myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
    $output = "$myDir\LogsCleanup.log"

    $computername = $env:computername

    $date = Get-Date
    $rundate = ($date.ToString("yyyy-MM-dd HH:mm:ss"))
    $currentmonth = ($date.ToString("MM-yyyy"))
    $previousmonth = ($date.AddMonths(-$NbOfMonths).ToString("MM-yyyy"))
    [string]$firstdayofpreviousmonth = (Get-Date "01/$previousmonth").ToString("yyyy-MM-dd HH:mm:ss")

    #……………………………..
    # Logfile Strings
    #……………………………..

    $logstring0 = "====================================="
    $logstring1 = " Log File Cleanup Script"

    #————————————————-
    # Functions
    #————————————————-

    # This function is used to write the log file for the script
    Function Write-Logfile() {
    param( $logentry )
    $timestamp = (get-date -DisplayHint DateTime).tostring('yyyy-MM-dd HH:mm:ss')
    "$timestamp $logentry" | Out-File $output -Append
    }

    # This function is to test the completion of the async CopyHere method
    function IsFileLocked( [string]$path) {
    If ([string]::IsNullOrEmpty($path) -eq $true) {
    Throw "The path must be specified."
    }

    [bool] $fileExists = Test-Path $path

    If ($fileExists -eq $false) {
    Throw "File does not exist (" + $path + ")"
    }

    [bool] $isFileLocked = $true

    $file = $null

    Try {
    $file = [IO.File]::Open(
    $path,
    [IO.FileMode]::Open,
    [IO.FileAccess]::Read,
    [IO.FileShare]::None)

    $isFileLocked = $false
    }
    Catch [IO.IOException] {
    If ($_.Exception.Message.EndsWith("it is being used by another process.") -eq $false) {
    # Throw $_.Exception
    [bool] $isFileLocked = $true
    }
    }
    Finally {
    If ($file -ne $null) {
    $file.Close()
    }
    }

    return $isFileLocked
    }

    #————————————————-
    # Script
    #————————————————-

    #Log file is overwritten each time the script is run to avoid
    #very large log files from growing over time

    $timestamp = (get-date -DisplayHint DateTime).tostring('yyyy-MM-dd HH:mm:ss')

    # $timestamp = Get-Date -DisplayHint Time
    Write-Host $logstring0
    Write-Host $logstring1
    Write-Host " $rundate"
    "$timestamp $logstring0" | Out-File $output
    Write-Host $logstring0
    Write-Logfile $logstring1
    Write-Logfile " $date"
    Write-Logfile $logstring0

    #Check whether Logs path exists, exit if it does not
    if ((Test-Path $Logpath) -ne $true) {
    $tmpstring = "Log path $logpath not found"
    Write-Warning $tmpstring
    Write-Logfile $tmpstring
    EXIT
    }

    $tmpstring = "Current Month : $currentmonth"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    $tmpstring = "Previous Month: $previousmonth"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    #$tmpstring = "First Day of Previous Month: $firstdayofpreviousmonth"
    #Write-Host $tmpstring
    #Write-Logfile $tmpstring

    #Fetch list of log files older than 1st day of previous month
    $logstoremove = Get-ChildItem -Path "$($Logpath)\*.*" -Include *.$FileExt | Where {$_.LastWriteTime -lt $firstdayofpreviousmonth -and $_.PSIsContainer -eq $false}

    ### DEBUG ON ###
    # Get-ChildItem -Path "$($Logpath)\*.*" -Include *.$FileExt | Select Name, LastWriteTime
    # Write-Host $firstdayofpreviousmonth
    ### DEBUG OFF ###

    if ($($logstoremove.Count) -eq $null) {
    $logcount = 0
    } else {
    $logcount = $($logstoremove.Count)
    }

    $tmpstring = "Found $logcount logs earlier than $firstdayofpreviousmonth"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    #Init a hashtable to store list of log files
    $hashtable = @{}

    #Add each logfile to hashtable
    foreach ($logfile in $logstoremove) {
    $zipdate = $logfile.LastWriteTime.ToString("yyyy-MM")
    $hashtable.Add($($logfile.FullName),"$zipdate")
    }

    #Calculate unique yyyy-MM dates from logfiles in hashtable
    $hashtable = $hashtable.GetEnumerator() | Sort Value
    $dates = @($hashtable | Group -Property:Value | Select Name)

    #For each yyyy-MM date add those logfiles to a zip file
    foreach ($date in $dates) {
    $zipfilename = "$Logpath\$computername-$($date.Name).zip"

    if(-not (test-path($zipfilename))) {
    set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
    }

    $shellApplication = new-object -com shell.application
    $zipPackage = $shellApplication.NameSpace($zipfilename)

    $zipfiles = $hashtable | Where {$_.Value -eq "$($date.Name)"}

    $tmpstring = "Zip file name is $zipfilename and will contain $($zipfiles.Count) files"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    foreach($file in $zipfiles) {
    $fn = $file.key.ToString()

    $tmpstring = "Adding $fn to $zipfilename"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    $zipPackage.CopyHere($fn,16)

    #This avoids file lock/conflict issues with the asynchronous CopyHere command
    do {
    Start-sleep -s $sleepinterval
    } while (IsFileLocked($zipfilename))

    }

        #Compare the count of log files on disk to the count of files in the zip
        $zippedcount = ($zipPackage.Items()).Count

        $tmpstring = "Zipped count: $zippedcount"
        Write-Host $tmpstring
        Write-Logfile $tmpstring

        $tmpstring = "Files: $($zipfiles.Count)"
        Write-Host $tmpstring
        Write-Logfile $tmpstring

        #If the counts match it is safe to delete the log files from disk
        if ($zippedcount -eq $($zipfiles.Count)) {
            $tmpstring = "Zipped file count matches log file count, safe to delete log files"
            Write-Host $tmpstring
            Write-Logfile $tmpstring
            # foreach ($file in $zipfiles) {
            #     $fn = $file.Key.ToString()
            #     Remove-Item $fn
            # }

            #If an archive path was specified, move the zip file there
            if ($ArchivePath) {
                #Check whether the archive path is accessible
                if ((Test-Path $ArchivePath) -ne $true) {
                    $tmpstring = "Archive path $archivepath not found or inaccessible"
                    Write-Warning $tmpstring
                    Write-Logfile $tmpstring
                } else {
                    #Check whether this server's subfolder of the archive path exists
                    if ((Test-Path $ArchivePath\$computername) -ne $true) {
                        try {
                            #Create a subfolder named after the server
                            New-Item -Path $ArchivePath\$computername -ItemType Directory -ErrorAction STOP
                        }
                        catch {
                            #Subfolder creation failed
                            $tmpstring = "Unable to create $computername subfolder in $archivepath"
                            Write-Host $tmpstring
                            Write-Logfile $tmpstring

                            $tmpstring = $_.Exception.Message
                            Write-Warning $tmpstring
                            Write-Logfile $tmpstring
                        }
                    }

                    try {
                        #Move the zip file
                        Move-Item $zipfilename -Destination $ArchivePath\$computername -ErrorAction STOP
                        $tmpstring = "$zipfilename was moved to $archivepath\$computername"
                        Write-Host $tmpstring
                        Write-Logfile $tmpstring
                    }
                    catch {
                        #Move failed, log the error
                        $tmpstring = "Unable to move $zipfilename to $ArchivePath\$computername"
                        Write-Host $tmpstring
                        Write-Logfile $tmpstring
                        Write-Warning $_.Exception.Message
                        Write-Logfile $_.Exception.Message
                    }
                }
            }

        } else {
            $tmpstring = "Zipped file count does not match log file count, not safe to delete log files"
            Write-Host $tmpstring
            Write-Logfile $tmpstring
        }

    }

    #Finished
    $tmpstring = "Finished"
    Write-Host $tmpstring
    Write-Logfile $tmpstring

    ==========================================

  49. BoB

    Hi
    Line 59 (first day of previous month) is really clumsy and prone to errors depending on the local machine date/time settings.
    Use something like this instead:
    [string]$firstdayofpreviousmonth = (Get-Date -Day 01 -Hour 00 -Minute 00 -Second 00).AddMonths(-1)

    1. Paul Cunningham

      Yes it is. Wouldn’t it be easier if North America just switched to the same date format as everyone else? 😉

      I plan on fixing that soon. Thanks for the sample code.

  50. Tommas

    Hi!

    I have several W3SVC folders under the LogFiles folder. There are log files with the same names under the W3SVC folders, and the script's archive (zip) section cannot handle subfolders, so the log files with identical names cannot be zipped. How should I modify the zipping section so that the zip file mirrors the source folder structure?

    Thanks,
    Tamas

    1. Paul Cunningham

      If you’re using a separate scheduled task for each IIS site you could use the -ArchivePath parameter to specify different UNC paths for each one. Not ideal but doesn’t require any code changes.

      Otherwise, the $archivepath variable is what you would need to modify. You can see in the script anywhere that $archivepath$computername is used you could change that to something else that suits your scenario.

      I’ll make a note for the next version to handle multiple sites better though.
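      One way to handle multiple site folders without modifying the script itself is a small wrapper that runs the cleanup once per site. This is a hypothetical sketch, not part of the published script; the `$logroot` and `$archive` paths are example assumptions you would replace with your own:

      ```powershell
      # Hypothetical wrapper: run the cleanup once per W3SVC* site folder,
      # giving each site its own archive subfolder so identically-named
      # log files from different sites don't collide.
      $logroot = "C:\inetpub\logs\LogFiles"     # assumed root containing W3SVC1, W3SVC2, ...
      $archive = "\\nas01\archives\iislogs"     # assumed central archive share

      Get-ChildItem -Path $logroot |
          Where-Object {$_.PSIsContainer -and $_.Name -like "W3SVC*"} |
          ForEach-Object {
              .\IISLogsCleanup.ps1 -Logpath $_.FullName -ArchivePath "$archive\$($_.Name)"
          }
      ```

      The `PSIsContainer` filter keeps the sketch compatible with PowerShell 2.0, which lacks `Get-ChildItem -Directory`.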

      1. Tommas

        Hi!
        Thank you! Yes, that is a solution, but I would prefer folders inside the zip files rather than creating separate directories.
        T.

  51. Jimmy

    Thanks for sharing Paul.

  52. Kelley Underwood

    I have found that the last day of the month is not zipped and archived with the previous days. I’m assuming this is because the date modified value on that file is actually the first day of the new month since that is when it was closed and a new log begun. Is there any way around this?

    1. Paul Cunningham

      Perhaps if you change the script to look at file creation date instead of last modified date?
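      For example, the selection and grouping could compare `CreationTime` instead. This is a sketch only; the exact selection line in the published script may differ:

      ```powershell
      # Sketch: filter (and group) on CreationTime instead of LastWriteTime,
      # so the month's final log, which IIS last writes on the 1st of the
      # following month, still lands in the correct monthly zip.
      $logstoremove = Get-ChildItem -Path "$Logpath\*.*" -Include *.log |
          Where-Object {$_.CreationTime -lt $firstdayofpreviousmonth}

      # ...and in the grouping loop:
      # $zipdate = $logfile.CreationTime.ToString("yyyy-MM")
      ```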

  53. Feras Mustafa

    Hi Paul,
    Great script…. !!!
    Any plans to expand that to do the Exchange 2010 logs (RPC, EWS, AB, etc.) as well??

    1. Feras Mustafa

      Here is another enhancement suggestion :). Could the script be used to clean up the logs on multiple servers in a single run?

  54. Rob Pettigrew

    Paul, it seems to only clean log files up to January 3rd. First day of previous month should be 03/01/2013 not 01/03/2014.

    Previous Month: 03/14
    First Day of Previous Month: 01/03/2014 00:00:00
    Found 0 logs earlier than 01/03/2014 00:00:00

    Line 59 is where that first day is derived. [string]$firstdayofpreviousmonth = Get-Date "01/$previousmonth"

    I am just not certain how to insert the 01 in the middle without breaking apart month and year as separate variables. Are you seeing this problem as well or is your date formatting different than mine?

    1. Rob Pettigrew

      To fix my problem I just added

      $lastMonth = (get-date).AddMonths(-1)

      And then modified your firstdayofpreviousmonth variable.

      [string]$firstDayOfPreviousMonth = (get-date -year $lastMonth.Year -month $lastMonth.Month -day 1).ToString("MM/dd/yy")

      Script is running again and seems to be working perfectly I will let you know if I run into any issues.

      1. Lawrence Lake

        The scripts works perfectly, the date format is DD/MM/YY check and you’ll notice that it does get all the files for the previous month(s). Thanks for yet another brilliant script, Paul

        1. Rob Pettigrew

          Yes, that works perfectly in a dd/mm/yy region. However, the US is mm/dd/yy thus the need for us to tweak it slightly.

      2. Kelley Underwood

        I have made these modifications and have some log files from March that still are not being found. They do not start with the first of the month. Here are the two lines added/modified:

        $lastMonth = (get-date).AddMonths(-1)
        [string]$firstDayOfPreviousMonth = (get-date -year $lastMonth.Year -month $lastMonth.Month -day 1).ToString("MM/dd/yy")

        Any ideas appreciated.

        1. Rob Pettigrew

          My guess is that you ran the script at some point before March ended which would have created the HOSTNAME-2014-03.zip file and compressed then deleted log files up to that date in March. Then when it ran again it compressed the new remaining March log files into the HOSTNAME-2014-03.zip file but could not delete them because of the difference in the number of files compressed at that time and the number of files total in the zip file. It does a check to see how many files were compressed and how many files are in the zip file and if that is different then the delete function bails to save you from deleting log files that may not be backed up. Double check your zip file and see if the dates that were not deleting are in fact in the zip file.

        2. Kelley Underwood

          I do not have a zip file at all. So far the script has not found anything to zip. I’ve not made any modifications to the logging and the log files are named as such: u_ex140323.log with yymmdd in the name.

        3. Rob Pettigrew

          Couldn't reply directly to your comment for some reason. So you verified that the log file u_ex140323.log has a modified date of 03/23/2014, or something close to that, and not a date in April? Can you run the script from a PowerShell command prompt to verify it's not failing the execution signing policy?

          This is the chunk of code I modified just to make sure it looks similar to what you did. You can see my new $lastMonth variable and the modified string variable.

          $date = Get-Date
          $currentmonth = ($date.ToString("MM/yy"))
          $previousmonth = ($date.AddMonths(-1).ToString("MM/yy"))
          $lastMonth = (get-date).AddMonths(-1)
          [string]$firstDayOfPreviousMonth = (get-date -year $lastMonth.Year -month $lastMonth.Month -day 1).ToString("MM/dd/yy")

        4. Kelley Underwood

          My script looks exactly like yours. Below is the screen output when I run directly from powershell.
          [PS] C:\scripts>.\IISLogsCleanup.ps1 -Logpath "C:\inetpub\logs\LogFiles\W3SVC1"
          Current Month: 05/14
          Previous Month: 04/14
          First Day of Previous Month: 01/04/2014 00:00:00
          Found 0 logs earlier than 01/04/2014 00:00:00
          Finished
          I have log files in this path from March 23 until today. The date modified is always the day after the logs. For example for the March 23 log its modified date is March 24.

        5. Rob Pettigrew

          That helps a lot. I see this:

          First Day of Previous Month: 01/04/2014 00:00:00
          Found 0 logs earlier than 01/04/2014 00:00:00

          Which leads me to believe you still have this line in there

          [string]$firstdayofpreviousmonth = Get-Date "01/$previousmonth"

          It must be after the string that I modified thus modifying that variable to 01/previousmonth giving you 01/04/2014. You can put a hashtag # in front of that line to comment it out in case you need/want to use it later. Here is a larger snippet of the code so you can see where i commented out the previous firstdayofpreviousmonth string with the hashtag in front of it.

          $computername = $env:computername

          $date = Get-Date
          $currentmonth = ($date.ToString("MM/yy"))
          $previousmonth = ($date.AddMonths(-1).ToString("MM/yy"))
          $lastMonth = (get-date).AddMonths(-1)
          [string]$firstDayOfPreviousMonth = (get-date -year $lastMonth.Year -month $lastMonth.Month -day 1).ToString("MM/dd/yy")
          #[string]$firstdayofpreviousmonth = Get-Date "01/$previousmonth"

          $myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
          $output = "$myDir\IISLogsCleanup.log"

        6. Kelley Underwood

          I do have that line commented out. I’ve compared your snippet to mine and it looks the same. When you run the script do you see 4/1/2014 where I’m seeing 1/4/2014?

        7. Rob Pettigrew

          Yes my log files looks like this

          05/01/2014 01:00:02 =====================================
          05/01/2014 01:00:02 IIS Log File Cleanup Script
          05/01/2014 01:00:02 05/01/2014 01:00:02
          05/01/2014 01:00:02 =====================================
          05/01/2014 01:00:02 Current Month: 05/14
          05/01/2014 01:00:02 Previous Month: 04/14
          05/01/2014 01:00:02 First Day of Previous Month: 04/01/14
          05/01/2014 01:00:02 Found 31 logs earlier than 04/01/14

          You are in the USA correct? Verify the region on your machine is set correctly to English United States. M/d/yyyy etc.

          Perhaps download the script fresh make the modifications and try again? Could also type in each line one at a time and then type the variable name to see what the current value is. Start with the date variable and make your way to firstdayofpreviousmonth.

          Type $date = Get-Date. Then type $date and it should return today's date. Type the next variable and check its value. Do this all the way until firstdayofpreviousmonth and that should return 04/01/14.

        8. Kelley Underwood

          Thank you for your kind help. I’ve stepped through all the variables in PS and they all return the proper value including 04/01/2014 but when the script runs it writes 01/04/2014 to the log file. I’ll try with a clean copy of the script. I must have something messed up.

        9. Kelley Underwood

          I’m so embarrassed. When I went to get a clean copy of the script, I found I had been editing the file in my download folder and not in my scripts folder. When I ran the correct copy, all worked as it should.

        10. Rob Pettigrew

          Ha no big deal. Glad it was something simple like that 🙂

    2. Rob Pettigrew

      I also realized why you had the script written this way…Australia! Makes good sense now why it required some modifications for US date formatting. Thanks again for this script everything else works perfectly!

      1. Paul Cunningham

        You know the USA is the only country that uses MM/DD/YYYY right? Maybe it’s time you all changed 😉

        I’ll try and incorporate a fix in the next version so this modification isn’t necessary.
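        Building on the fixes suggested in this thread, a culture-independent version could replace the string-parsing approach with plain DateTime arithmetic (a sketch; line numbers and variable names follow the script, but the surrounding code may differ in your copy):

        ```powershell
        # Culture-independent first day of the previous month: no string
        # parsing, so the result is the same whether the machine uses
        # MM/dd/yyyy (US) or dd/MM/yyyy (most other regions).
        $firstdayofpreviousmonth = (Get-Date -Day 1).Date.AddMonths(-1)

        # Keeping it as a [datetime] (rather than casting to [string]) lets
        # comparisons against file timestamps work directly, e.g.:
        # Where {$_.LastWriteTime -lt $firstdayofpreviousmonth}
        ```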

  55. Bill

    I get this on another server.

    PS C:\Users\Administrator\desktop> .\IISLogsCleanup.ps1 -Logpath "D:\logs\W3SVC1"
    Current Month: 04/14
    Previous Month: 03/14
    First Day of Previous Month: 01/03/2014 00:00:00
    Found 0 logs earlier than 01/03/2014 00:00:00
    Finished

    Thanks.
    Bill

    1. Bill

      I found the error in this one.

      1. Bill

        # Variables
        #-------------------------------------------------

        $sleepinterval = 10

        $computername = $env:computername

        $date = Get-Date
        $currentmonth = ($date.ToString("MM/yy"))
        $previousmonth = ($date.AddMonths(-1).ToString("MM/yy"))
        [string]$firstdayofpreviousmonth = Get-Date "01/$previousmonth"

        $myDir = Split-Path -Parent $MyInvocation.MyCommand.Path
        $output = "$myDir\IISLogsCleanup.log"

        I removed the 01 above and that part works.

        Bill

        1. Rob Pettigrew

          Removing the 01 effectively makes that variable the same as the previousmonth variable. It is not getting the first day of the previous month but putting the year in the day column for the previous month, so your log rotations will always go up to the 14th of the previous month. Next year they will go up to the 15th of the previous month. You can see the output I have below after removing the 01:

          Current Month: 04/14
          Previous Month: 03/14
          First Day of Previous Month: 03/14/2014 00:00:00
          Found 373 logs earlier than 03/14/2014 00:00:00

        2. Rob Pettigrew

          Furthermore and more importantly you will get “Zipped file count does not match log file count, not safe to delete log files” errors since the script will be zipping up files mid month. The whole purpose of the script would be defeated at this point. See my solution below for US date formatting.

  56. Bill

    I am getting this error:

    PS C:\Users\Administrator\desktop> .\IISLogsCleanup.ps1 -Logpath "c:\logs\W3SVC1"
    Current Month: 04/14
    Previous Month: 03/14
    First Day of Previous Month: 01/03/2014 00:00:00
    Found 0 logs earlier than 01/03/2014 00:00:00
    You cannot call a method on a null-valued expression.
    At C:\Users\Administrator\desktop\IISLogsCleanup.ps1:144 char:47
    + $zipdate = $logfile.LastWriteTime.ToString <<<< ("yyyy-MM")
    + CategoryInfo : InvalidOperation: (ToString:String) [], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull

    Exception calling "Add" with "2" argument(s): "Key cannot be null.
    Parameter name: key"
    At C:\Users\Administrator\desktop\IISLogsCleanup.ps1:145 char:19
    + $hashtable.Add <<<< ($($logfile.FullName),"$zipdate")
    + CategoryInfo : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

    What am I doing wrong?
    Thanks.
    Bill

    1. Chris

      I get this as well.

  57. Will Fulmer

    Thanks Paul! Excellent resource!

  58. Tommas

    Hi!

    On an exchange server, I get this message in the IISLogsCleanup.log:

    “Zipped file count does not match log file count, not safe to delete log files”

    Any Idea?
    Thanks,
    Tamas

    1. Paul Cunningham

      Try increasing the $sleepinterval to allow more time for each file to be added to the Zip before the next one is tried.
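      That wait loop relies on the script's IsFileLocked helper. A minimal sketch of such a check (an assumption about the helper's implementation; the published function may differ) is:

      ```powershell
      # Sketch: attempt an exclusive open of the zip file. An exception means
      # the asynchronous CopyHere operation is still writing to it, so the
      # caller keeps sleeping for $sleepinterval seconds and retrying.
      function IsFileLocked($filepath) {
          try {
              $fs = [System.IO.File]::Open($filepath, 'Open', 'ReadWrite', 'None')
              $fs.Close()
              return $false
          }
          catch {
              return $true
          }
      }
      ```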

      1. sai teja

        Hi, the code works well for me. I have a requirement that the script should automatically search for the log files instead of my supplying the mandatory path. Please help.

  59. Charles Derber

    Thanks Paul 🙂
