I have used several approaches in the past. Using Out-File -Append works fine for a "quick info", but here is my favorite: the .NET StreamWriter, which in my experience is the fastest and most reliable way.
1. Define your logfile. Here I stamp it with the day only; be aware that this approach will overwrite the file on every run if it has the same name. If you want to run the script more than once per day, just put a more detailed timestamp ("yyyyMMdd-HHmm") in the name.
$LogfileName = $(-join('c:\_Script\',$(Get-Date -Format "yyyyMMdd"),'_MyScript.log'))
$global:Logfile = [System.IO.StreamWriter] $LogfileName
$Logfile.AutoFlush = $true
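If you want the run-more-than-once-per-day variant mentioned above, a minimal sketch (assuming the same folder and naming scheme) could look like this:
# One logfile per run, stamped down to the minute (HH = 24-hour clock)
$LogfileName = -join('c:\_Script\',$(Get-Date -Format "yyyyMMdd-HHmm"),'_MyScript.log')
$global:Logfile = [System.IO.StreamWriter] $LogfileName
$Logfile.AutoFlush = $true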
2. Define a function for "quick" entries.
function LogEntry ($Log) {
    @($Log) | %{
        if ($_) {
            $Logfile.WriteLine((Get-Date -Format 'yyyy-MM-dd HH:mm:ss:fff ') + $_.ToString())
        }
    }
}
3. How to use it ..
LogEntry("My Script was started")
Variables can be put in ..
$user=&'whoami'
LogEntry("I run under the Account $user")
Be aware that when you want to log a property etc. you have to enclose the expression in $() ...
$vm = Get-View -ViewType VirtualMachine -Filter @{"Name"=$vmname}
LogEntry "MyScript is messing up the VM $($vm.Name) which has Status $($vm.Runtime.PowerState) ."
If your output has several lines ...
Set-NaCifsShareAcl -User $user -Share $share -AccessRights 'Full Control' -OutVariable Log -ErrorAction Stop
LogEntry($Log|Out-String -Stream)
If you don't trust your command, run it in a try/catch and get the error message into your logfile.
try {
    $data = Import-CSV '\\fileserver\mydata.csv' -ErrorAction Stop
    LogEntry("Importing Data from Share went fine, got $($data.Count) entries.")
} catch {
    LogEntry("Importing Data from Share went wrong with $($_.Exception.Message) .")
}
This does the job in most cases, and you get useful information in your log.
4. But what about when you start some long-running command, like a backup task, from the command line? You can capture the output in a variable and write it to the log, but the caveat is that the entries will only show up AFTER the command is done, and all lines will share the timestamp of when it finished. In the worst case you have a hanging command/process and no clue what state it is in.
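For illustration, the capture-then-log approach could look like this (the backup command path is just a placeholder, not from the original script):
# Hypothetical long-running command: nothing reaches the log until it returns,
# so every line ends up with the same timestamp.
$out = & 'c:\_Script\backup.cmd'
LogEntry($out)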
So how do we fill the logfile in real time while the command is still running, so we can watch it with BareTail or similar?
We need another function that only does the timestamping and can be used in a streaming pipeline.
function TimeStampIt {process{"$(Get-Date -Format 'yyyy-MM-dd HH:mm:ss:fff ') $_"}}
Now we pipe the console output of our command through this function line by line into the WriteLine method of the logfile object. The "%" is an alias for "ForEach-Object".
My lab sample is a simple .cmd file which pings localhost.
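The exact piping line isn't shown here, so this is a minimal sketch of it, assuming the lab sample lives at c:\_Script\pingtest.cmd and simply runs "ping 127.0.0.1 -n 5":
# Stream every console line through TimeStampIt straight into the StreamWriter;
# with AutoFlush = $true each line hits the file as soon as it is written.
& 'c:\_Script\pingtest.cmd' | TimeStampIt | %{ $Logfile.WriteLine($_) }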
This is the result in your logfile. You can see that every line has its own timestamp, and while the command is running you can watch the logfile with BareTail or similar.
2015-10-23 10:35:11:574
2015-10-23 10:35:11:574 C:\Users\blablabla>ping 127.0.0.1 -n 5
2015-10-23 10:35:11:590
2015-10-23 10:35:11:590 Pinging 127.0.0.1 with 32 bytes of data:
2015-10-23 10:35:11:590 Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
2015-10-23 10:35:12:604 Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
2015-10-23 10:35:13:618 Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
2015-10-23 10:35:14:632 Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
2015-10-23 10:35:15:646 Reply from 127.0.0.1: bytes=32 time<1ms TTL=128
2015-10-23 10:35:15:646
2015-10-23 10:35:15:646 Ping statistics for 127.0.0.1:
2015-10-23 10:35:15:646 Packets: Sent = 5, Received = 5, Lost = 0 (0% loss),
2015-10-23 10:35:15:646 Approximate round trip times in milli-seconds:
2015-10-23 10:35:15:646 Minimum = 0ms, Maximum = 0ms, Average = 0ms
5. Don't forget to close the logfile at the end of the script.
LogEntry("Done my Job.")
$Logfile.Close()