I have been using robocopy to copy files from many different places to one main storage server so the data can be backed up to tape. Some of the data sets are very large, around 2 TB. I noticed that when I run these batch files from Task Scheduler, they use much less network bandwidth, which makes them even slower. I have a 10 Gb backup network that I seem to be barely using.
I thought that if I could create a script that would spawn several robocopy processes at once, I could speed up the entire process. Instead, it seems to have made things slower, and my memory utilization goes through the roof. I know robocopy pretty well, and I know about the /MT:32 switch, which tells robocopy to copy with 32 threads within a single process.
If I manually run three robocopy commands at once, I can speed things up, but I can't seem to duplicate that with the script or from Task Scheduler.
Here is my script; any ideas?
# Enumerate every file under C:\test and start one background job per file.
$filelist = Get-ChildItem -Path C:\test -Recurse -Force
foreach ($file in $filelist) {
    if (!$file.PSIsContainer) {
        $prep  = $file.DirectoryName
        $dest  = $prep.Replace("C:\test", "\\atl01osi357\K$\SVRBKUPS\test")
        $xfile = $file.Name
        $sb    = { robocopy $args[0] $args[1] $args[2] /E /ZB /copyall }
        $job   = Start-Job -ScriptBlock $sb -ArgumentList $prep, $dest, $xfile
    }
}
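One variation I have been considering, as a rough sketch rather than a tested fix: cap the number of concurrent jobs, and copy one subdirectory per job instead of one file per job, letting /MT parallelize within each job. The $maxJobs value of 3 (matching what worked when I ran three copies by hand) and the assumption that C:\test splits cleanly into top-level subdirectories are mine:

```powershell
$maxJobs = 3                                     # assumed limit, matches the manual test
$source  = 'C:\test'
$dest    = '\\atl01osi357\K$\SVRBKUPS\test'

# One job per top-level directory, not per file.
$dirs = Get-ChildItem -Path $source -Directory

foreach ($dir in $dirs) {
    # Throttle: wait for a free slot before starting another robocopy.
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Seconds 5
    }
    $target = Join-Path $dest $dir.Name
    Start-Job -ScriptBlock {
        param($src, $dst)
        robocopy $src $dst /E /ZB /COPYALL /MT:32
    } -ArgumentList $dir.FullName, $target
}

# Wait for the remaining jobs, collect their output, and clean up.
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
```

I have not verified how this behaves when launched from Task Scheduler; it only addresses the unbounded-jobs problem in my script above.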