I have this code snippet:
Random rand = new Random();
int chance = rand.Next(1, 101);
if (chance <= 25) // probability of 25%
{
    Console.WriteLine("You win");
}
else
{
    Console.WriteLine("You lose");
}
My question is: does this really give the player a 25% probability of winning?
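My understanding is that Next(1, 101) returns an integer from 1 to 100 inclusive, so 25 of the 100 possible values should satisfy chance <= 25. As a quick sanity check on the range (assuming the lower bound is inclusive and the upper bound is exclusive, as documented), I also ran something like this:

// Rough check of the smallest and largest values Next(1, 101) produces.
// The loop count of 1000000 is an arbitrary choice, just to cover the range.
Random r = new Random();
int min = int.MaxValue, max = int.MinValue;
for (int i = 0; i < 1000000; i++)
{
    int v = r.Next(1, 101);
    if (v < min) min = v;
    if (v > max) max = v;
}
Console.WriteLine($"min = {min}, max = {max}"); // I expect min = 1, max = 100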
Edit:
I just wrote this to test it:
double total = 0; // doubles so the division below is floating-point
double prob = 0;
Random rnd = new Random();
for (int i = 0; i < 100; i++)
{
    int chance = rnd.Next(1, 101); // Next returns an int in [1, 100]
    if (chance <= 25) prob++;
    total++;
}
Console.WriteLine(prob / total); // observed win frequency
Console.ReadKey();
The result is highly inaccurate: across runs it ranges from about 0.15 to 0.3.
But when I run more trials (changing (i < 100) to (i < 10000)), the result is much closer to 0.25.
Why is this? Why aren't 100 trials enough for an accurate result?
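To show what I mean by the spread, here is a quick variant I threw together that repeats the test a few times at each sample size (the five repetitions per size are an arbitrary choice):

// Repeats the frequency test several times so the run-to-run spread is visible.
Random rng = new Random();
foreach (int sampleSize in new[] { 100, 10000 })
{
    Console.Write($"n = {sampleSize}: ");
    for (int run = 0; run < 5; run++)
    {
        int wins = 0;
        for (int i = 0; i < sampleSize; i++)
        {
            if (rng.Next(1, 101) <= 25) wins++;
        }
        Console.Write($"{(double)wins / sampleSize:F3} ");
    }
    Console.WriteLine();
}

With n = 100 the printed frequencies jump around a lot, while with n = 10000 they stay close to 0.25.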