
powershell - Copy-Item with timeout

I am recovering files from a hard drive on which some of the files are unreadable. I'm unable to change the hardware-level timeout / ERC, and that's extremely difficult to work around when I have several hundred thousand files, tens of thousands of which might be unreadable.

The data issues were the result of a controller failure. By buying a matching drive (matching all the way down), I've been able to access the drive and can copy huge swaths of it without issue. However, there are unreadable files dotted throughout the drive that, when accessed, cause the SATA bus to hang. I've used various resumable file-copy applications like robocopy, RichCopy, and a dozen others, but they all share the same problem: their RETRY count only kicks in once the drive actually reports an error. Because the drive takes an extremely long time to report the error, a single file may take up to an hour to fail officially.

I know how fast each file SHOULD copy, so I'd like to build a PowerShell cmdlet or similar that takes a source and destination file name and tries to copy the file. If, after 5 seconds, the file hasn't copied (or if it has - this can be a dumb process), I'd like it to quit. I'll write a script that fires off each copy individually, waiting for the previous one to finish, but so far I haven't found a good way of putting a time limit on the process.

Any suggestions you might have would be greatly appreciated!

Edit: I would be happy with spawning Copy-Item in a new thread with a new PID, counting down, and then killing that PID. I'm just a novice at PowerShell, though, and I've seen so many conflicting methods for imposing timers that I'm lost on what the best-practice approach would be.
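A minimal sketch of that idea, assuming a hypothetical wrapper named Copy-WithTimeout (not from this thread): it launches the copy in a separate powershell.exe process so it gets its own PID, waits a few seconds, and force-kills the child if it hasn't finished.

function Copy-WithTimeout {
    param(
        [Parameter(Mandatory)][string]$Source,
        [Parameter(Mandatory)][string]$Destination,
        [int]$TimeoutSeconds = 5
    )

    # Run the copy in its own powershell.exe process so it has a separate PID.
    # Paths with embedded quotes may need extra escaping; this is illustrative only.
    $proc = Start-Process -FilePath 'powershell.exe' -PassThru -WindowStyle Hidden `
        -ArgumentList "-NoProfile -Command Copy-Item -LiteralPath '$Source' -Destination '$Destination'"

    # Wait up to the timeout; the timeout error is non-terminating, so suppress it.
    $proc | Wait-Process -Timeout $TimeoutSeconds -ErrorAction SilentlyContinue

    if (-not $proc.HasExited) {
        # Timed out: kill the child. A process stuck in a kernel-mode bus hang
        # may resist even this, as Edit 2 below describes.
        Stop-Process -Id $proc.Id -Force -ErrorAction SilentlyContinue
        return $false
    }
    return $true
}

Called with example paths such as Copy-WithTimeout -Source 'F:\file.bin' -Destination 'G:\file.bin', it returns $true if the child process finished within the time limit and $false if the copy had to be abandoned.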

Edit 2: Please note that applications like robocopy will utterly hang when they encounter the bad regions of the disk. These are not simple application hangs, but bus hangs that Windows tries to preserve in order not to lose data. In these instances Task Manager is unable to kill the process, but Process Explorer IS. I'm not sure what the difference in methodology is, but regardless, it seems relevant.



1 Reply


I'd say the canonical way of doing things like this in PowerShell is background jobs.

$timeout = 300 # seconds

# Run the copy as a background job so the caller is free to give up on it.
$job = Start-Job -ScriptBlock { Copy-Item ... }

# Wait at most $timeout seconds, then stop the job if it is still running,
# collect its output and errors, and clean it up.
Wait-Job -Job $job -Timeout $timeout
Stop-Job -Job $job
Receive-Job -Job $job
Remove-Job -Job $job

Replace Copy-Item inside the scriptblock with whatever command you want to run. Beware, though, that all variables you want to use inside the scriptblock must be either defined inside the scriptblock, passed in via the -ArgumentList parameter, or prefixed with the $using: scope modifier.
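For example, a short sketch of both options, using placeholder variables $src and $dst for the paths:

$src = 'F:\recovered\file.bin'   # placeholder paths for illustration
$dst = 'G:\backup\file.bin'

# Option 1: pass values via -ArgumentList and receive them with param().
$job = Start-Job -ScriptBlock {
    param($Source, $Destination)
    Copy-Item -LiteralPath $Source -Destination $Destination
} -ArgumentList $src, $dst

# Option 2: reference the caller's variables with the $using: scope modifier
# (PowerShell 3.0 and later).
$job = Start-Job -ScriptBlock {
    Copy-Item -LiteralPath $using:src -Destination $using:dst
}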

An alternative to Wait-Job would be a loop that waits until the job is completed or the timeout is reached:

# Poll the job state, giving up after 5 minutes.
$timeout = (Get-Date).AddMinutes(5)
do {
  Start-Sleep -Milliseconds 100
} while ($job.State -eq 'Running' -and (Get-Date) -lt $timeout)
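Putting the pieces together, here is a sketch of how the job-plus-timeout pattern could be wrapped around the asker's whole recovery run; the root paths, the 5-second limit, and the skipped-file log are assumptions taken from the question, not part of the answer above.

# Sketch: apply the job-plus-timeout pattern to every file under a source tree.
$sourceRoot = 'F:\'            # illustrative paths
$destRoot   = 'G:\recovered'
$limit      = 5                # seconds allowed per file

Get-ChildItem -Path $sourceRoot -Recurse -File | ForEach-Object {
    # Mirror the directory structure under the destination root.
    $dest = Join-Path $destRoot $_.FullName.Substring($sourceRoot.Length)
    New-Item -ItemType Directory -Path (Split-Path $dest) -Force | Out-Null

    $job = Start-Job -ScriptBlock {
        param($s, $d) Copy-Item -LiteralPath $s -Destination $d
    } -ArgumentList $_.FullName, $dest

    # Wait-Job returns the job if it finished in time, and nothing on timeout.
    if (-not (Wait-Job -Job $job -Timeout $limit)) {
        Add-Content -Path (Join-Path $destRoot 'skipped.txt') -Value $_.FullName
    }
    Stop-Job -Job $job
    Remove-Job -Job $job -Force
}

Note that this starts a new background job (and hence a new PowerShell process) per file, which adds noticeable overhead across hundreds of thousands of files, and a job blocked by a bus hang may linger until the drive finally reports the error.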
