In your previous questions your starting point was the Get-DfsrMembership cmdlet. In this question you've switched to starting with Get-ADComputer. What others have been trying to point out is that the properties returned by Get-DfsrMembership aren't available here. In particular, $_.ContentPath isn't available in this scenario. Furthermore, Get-ADComputer returns a Name property, not ComputerName.
That said, your real question, as described in your comments, is:
however, the challenge is how to list the file names, size and
location in each respective servers and export as .CSV
This is a question about patterns & concepts. You're conflating a value that has meaning for the entire collection (the sum) with the individual elements of the collection. The sum therefore can't really be stored in the CSV. Imagine that each line of the CSV holds the information from one of the 32 files; where on that line would you put the sum? It has no meaning with respect to anything else on that line.
Conceptually, objects are self-contained. It's a conundrum to try to squeeze an unrelated property in like this, and the flat-ish nature of CSV files makes it even more challenging.
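To make that concrete, here's a minimal local sketch (hypothetical path and file names) showing that the per-element rows and the aggregate live at different levels:

# Each file maps naturally to one CSV row:
$Files = Get-ChildItem -Path 'C:\Temp' -File | Select-Object Name, Length

# The sum describes the whole collection, not any single row:
$Total = ( $Files | Measure-Object -Property Length -Sum ).Sum

$Files | Export-Csv -Path '.\Files.csv' -NoTypeInformation  # rows only
"Total: $Total bytes"  # the aggregate has to live somewhere else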
So how to deal with this?
Note: Because of the $_.ContentPath problem I've hardcoded the path below. This is for demonstration, not production!
$Computers = ( Get-ADComputer -Filter { Name -like "FileServer*" } ).Name
$scriptBlock = {
    param ( [string]$Path )

    $noEmpties = [StringSplitOptions]::RemoveEmptyEntries

    # /L lists files without copying; /BYTES reports exact sizes; the destination is a dummy:
    robocopy $Path 'NoDestination' /L /E /NDL /NJH /NJS /NP /NC /BYTES |
    ForEach-Object{
        # Skip the blank lines robocopy emits between sections:
        if ( [string]::IsNullOrWhiteSpace( $_ ) ) { return }
        $TempArr = $_.Split( " `t", 2, $noEmpties )
        [PSCustomObject]@{
            FilePath = $TempArr[1]
            FileName = $TempArr[1].Split( '\' )[-1]
            Length   = [Int64]$TempArr[0]
        }
    } |
    Sort-Object -Property Length | Select-Object -Last 32
}
Try
{
    $Results = Invoke-Command -ComputerName $Computers -ScriptBlock $scriptBlock -ArgumentList 'C:\Temp2' |
    Group-Object -Property PSComputerName |
    ForEach-Object{
        [PSCustomObject]@{
            ComputerName                = [String]$_.Name
            'Server - IP'               = "$($_.Name) [$((Resolve-DnsName -Name $_.Name -Type A).IPAddress)]"
            'Top 32 Largest Files Size' = [Math]::Round( ($_.Group.Length | Measure-Object -Sum).Sum / 1GB, 2 )
            'Top 32 Largest Files'      = [Object[]]$_.Group | Select-Object FilePath, FileName, Length
        }
    }

    # Export one CSV per computer:
    $Results |
    ForEach-Object{
        $CSVFileName = $_.ComputerName + '-BigFiles.csv'
        $_.'Top 32 Largest Files' | Export-Csv -Path $CSVFileName -NoTypeInformation
    }
}
Catch
{
    $_ | Write-Error
}
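Incidentally, if you want to test the remote half in isolation, the script block can be invoked locally against any path (the path here is just an example):

& $scriptBlock -Path 'C:\Temp'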
Note: Again, this is imperfect; I'm not as focused on efficiency and performance at the moment. I'm not yet sure if I could've eliminated a loop or two.
Because of your shifting requirements, we have to pull 2 things from the RoboCopy output:
- The File Size
- The File Path
Then from the file path we can derive the file name pretty easily. As in my previous work, I've used some cagey .Split() invocations to get at that. I'm sure that @Theo may have some RegEx approach...
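For what it's worth, here's one guess at what such a RegEx approach could look like, as a drop-in for the ForEach-Object body above (my sketch, not Theo's actual code):

# Assumes each robocopy /BYTES line is "<size><whitespace><full path>":
if ( $_ -match '^\s*(\d+)\s+(.+)$' )
{
    [PSCustomObject]@{
        FilePath = $Matches[2]
        FileName = Split-Path -Path $Matches[2] -Leaf
        Length   = [Int64]$Matches[1]
    }
}

Split-Path -Leaf also sidesteps the .Split() gymnastics for the file name.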
So the first thing is I have the remote side generating an object with only these properties. I'm doing the rest of the work locally, because I need both properties; we are no longer returning a single value from the remote machine.
With the results I'm creating a new set of objects containing the ComputerName, 'Server - IP', 'Top 32 Largest Files Size' AND 'Top 32 Largest Files'. That last property is an array of objects, each with the FilePath, FileName, and Length properties.
Now I can loop through $Results and create the CSV files, naming them according to the ComputerName property.
This is imperfect for sure. One alternative might be to sub-delimit the file paths with something like ";". That would sufficiently flatten the object for the CSV file, though we'd have to abandon the FileName & Length properties. Flat objects are easier to deal with, but a sub-delimited field then needs to be parsed on input by whatever other process you use it for.
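A minimal sketch of that flattening, assuming you keep only the paths (the output file name is mine):

$Results |
    ForEach-Object{
        [PSCustomObject]@{
            ComputerName                = $_.ComputerName
            'Top 32 Largest Files Size' = $_.'Top 32 Largest Files Size'
            'Top 32 Largest Files'      = $_.'Top 32 Largest Files'.FilePath -join ';'
        }
    } |
    Export-Csv -Path '.\BigFiles-Flat.csv' -NoTypeInformation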
Another question is whether you really need to store the output in a CSV file. JSON, for example, is much better for storing hierarchical objects. Export-Clixml / Import-Clixml may also be of interest. Of course, again, it may depend on where you want to later consume that data etc...
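A quick sketch of both, with file names of my choosing:

# JSON round-trips the nested 'Top 32 Largest Files' array intact:
$Results | ConvertTo-Json -Depth 4 | Set-Content -Path '.\BigFiles.json'
$FromJson = Get-Content -Path '.\BigFiles.json' -Raw | ConvertFrom-Json

# CliXml additionally preserves .NET types like [Int64]:
$Results | Export-Clixml -Path '.\BigFiles.xml'
$FromXml = Import-Clixml -Path '.\BigFiles.xml'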
I think all this points to a need to adjust your requirements to fit the task.