My psychic powers are not great, and it is hard to tell what is going on without actually debugging it. But here's a guess. The issue I discuss here:
Why does this floating-point calculation give different results on different machines?
applies not just to "cross machine" but also to "debug vs release". It is not only possible but likely that the release version of your program is using higher-precision math than your debug version. If you have floating-point bugs in there, then it is entirely possible that by sheer bad luck you are hitting those bugs only in the higher-precision release version and not in the lower-precision debug version.
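To make the shape of that failure concrete, here is a minimal hypothetical sketch (not your code; whether the two builds actually disagree depends on your hardware and jitter). An exact equality test on a computed double is exactly the kind of thing that can flip:

```csharp
using System;

class Repro
{
    static void Main()
    {
        double tenth = 0.1;

        // 0.1 is not exactly representable, so tenth * 10.0 is only approximately 1.0.
        // Rounded to 64-bit double, the product happens to compare equal to 1.0; but if
        // the jitter keeps the product in an extended-precision register, the extra bits
        // can survive and the comparison can come out false. Debug and release builds
        // can therefore disagree about which branch runs.
        if (tenth * 10.0 == 1.0)
            Console.WriteLine("took the 'equal' branch");
        else
            Console.WriteLine("took the 'not equal' branch");
    }
}
```

Exact equality on computed floating-point values is fragile no matter what; comparing against a small tolerance makes the code indifferent to whichever precision the jitter happens to use.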
Why the difference? Because in the unoptimized build the C# compiler frequently generates code for temporary values as though they were local variables; the jitter then actually allocates stack slots for those temporaries and writes the temporary values from the registers out to them. When it needs them again, it reads them back from the stack into registers. That round trip can truncate a value that was sitting in a high-precision register (for example an 80-bit x87 register) down to mere 64-bit double precision, losing bits of precision.

In the optimized build the C# compiler and the jitter work harder to keep everything in registers all the time, because obviously that is both faster and higher precision, though harder to debug.
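Here is a sketch of that round trip, assuming an x86 jitter that can hold intermediates in 80-bit x87 registers (the class and field names are just illustrative). Pushing a value through a 64-bit storage location discards the extra bits, which is the same thing the debug build's temporary locals do:

```csharp
using System;

class SpillDemo
{
    // A static field is a 64-bit memory location, so writing a value into it
    // forces the value down to double precision, much like the temporary
    // locals the debug build generates.
    static double spilled;

    static void Main()
    {
        double tiny = 1e-17;
        double one = 1.0;

        // Computed in one expression: if the jitter keeps (one + tiny) in an
        // extended-precision register, the tiny addend can survive and the
        // result may be nonzero.
        double inRegister = (one + tiny) - one;

        // Same arithmetic, but the intermediate sum takes a round trip through
        // 64-bit storage, where 1e-17 is too small to be kept; the result is 0.
        spilled = one + tiny;
        double throughMemory = spilled - one;

        Console.WriteLine($"kept in register (maybe): {inRegister}");
        Console.WriteLine($"forced through memory:    {throughMemory}");
    }
}
```

Neither outcome is a compiler bug; the language explicitly permits floating-point operations to be performed at higher precision than their declared type, so robust code should not depend on the low-order bits either way.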
Good luck. Bugs that only repro in release mode are a total pain.