I can't guarantee that this is the fastest, but it's certainly quite efficient:
bool areEquivalent = array1.Length == array2.Length
&& new HashSet<string>(array1).SetEquals(array2);
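For example (sample arrays invented for illustration), note that the Length check is doing real work, since SetEquals compares set contents and ignores how many times each string appears:
var array1 = new[] { "a", "b", "c" };
var array2 = new[] { "c", "b", "a" };
var array3 = new[] { "a", "a", "b", "c" };

// Same strings, different order: equivalent.
Console.WriteLine(array1.Length == array2.Length
    && new HashSet<string>(array1).SetEquals(array2)); // True

// array3 reduces to the same set {a, b, c}, but the lengths differ,
// so the combined check correctly rejects it.
Console.WriteLine(array1.Length == array3.Length
    && new HashSet<string>(array1).SetEquals(array3)); // False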
EDIT:
SaeedAlg and Sandris raise a valid point: arrays that contain the same distinct strings but with different duplicate frequencies defeat this approach. For example, {"a", "a", "b"} and {"a", "b", "b"} have equal lengths and equal sets but are not equivalent. I can see two workarounds if this matters (I haven't thought hard about their relative efficiencies):
1. Sort both arrays and then compare them sequentially. In theory, this should have loglinear, O(n log n), complexity in the worst case (a fuller sketch follows the list below). E.g.:
return array1.Length == array2.Length
&& array1.OrderBy(s => s).SequenceEqual(array2.OrderBy(s => s));
2. Build up a frequency table of the strings in each array and then compare the tables; with hash-based grouping this runs in expected linear time (a Dictionary-based variant is sketched after the list). E.g.:
if (array1.Length != array2.Length)
    return false;

var f1 = array1.GroupBy(s => s)
               .Select(group => new { group.Key, Count = group.Count() });
var f2 = array2.GroupBy(s => s)
               .Select(group => new { group.Key, Count = group.Count() });

return !f1.Except(f2).Any();
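For completeness, here is workaround 1 wrapped in a method. The method name is mine, and StringComparer.Ordinal is my own addition: OrderBy on strings defaults to culture-sensitive ordering, and pinning both sorts to ordinal keeps the sort order consistent with the ordinal equality that SequenceEqual uses by default.
using System;
using System.Linq;

static bool AreEquivalentBySorting(string[] array1, string[] array2)
{
    // Cheap rejection: arrays of different lengths can't be equivalent.
    if (array1.Length != array2.Length)
        return false;

    // Sort both with an ordinal comparer so that equal strings always
    // end up in the same relative positions in both sequences.
    return array1.OrderBy(s => s, StringComparer.Ordinal)
                 .SequenceEqual(array2.OrderBy(s => s, StringComparer.Ordinal));
}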
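And here is a sketch of the frequency-table idea from workaround 2 using an explicit Dictionary in place of GroupBy/Except; same result, but it makes the single counting pass per array visible (again, the method name is mine).
using System.Collections.Generic;

static bool AreEquivalentByCounting(string[] array1, string[] array2)
{
    if (array1.Length != array2.Length)
        return false;

    // Count occurrences in array1...
    var counts = new Dictionary<string, int>();
    foreach (var s in array1)
    {
        counts.TryGetValue(s, out var count);
        counts[s] = count + 1;
    }

    // ...then consume those counts with array2.
    foreach (var s in array2)
    {
        if (!counts.TryGetValue(s, out var count) || count == 0)
            return false; // s occurs more often in array2 than in array1

        counts[s] = count - 1;
    }

    // Lengths are equal and every array2 element was matched,
    // so all counts must be back to zero.
    return true;
}
On the counterexample above ({"a", "a", "b"} vs. {"a", "b", "b"}), both workarounds return false, whereas the bare HashSet check returns true.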