I've read plenty of articles (and a couple of similar questions posted on Stack Overflow) about how and when to use assertions, and I understand them well. But I still don't understand what kind of motivation should drive me to use Debug.Assert instead of throwing a plain exception. What I mean is, in .NET the default response to a failed assertion is to "stop the world" and display a message box to the user. Although this behavior can be modified, I find it highly annoying and redundant to do that when I could instead just throw a suitable exception. That way, I could write the error to the application's log just before throwing the exception, and my application doesn't necessarily freeze.
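To make the contrast concrete, here is a rough sketch of the two approaches I'm talking about (the class, method, and parameter names are made up purely for illustration):

    using System;
    using System.Diagnostics;

    // Hypothetical example -- "OrderProcessor" and "orderId" are made-up names.
    public class OrderProcessor
    {
        // Variant A: Debug.Assert. In a Debug build a failed check shows the
        // "stop the world" assertion dialog; in a typical Release build (no
        // DEBUG symbol) the call is compiled out, so nothing is checked at all.
        public void ProcessWithAssert(string orderId)
        {
            Debug.Assert(orderId != null, "orderId must not be null");
            // ... actual processing ...
        }

        // Variant B: plain exception. I can log first, the application
        // doesn't freeze, and the check is still present in Release builds.
        public void ProcessWithException(string orderId)
        {
            if (orderId == null)
            {
                Trace.TraceError("ProcessWithException called with a null orderId");
                throw new ArgumentNullException(nameof(orderId));
            }
            // ... actual processing ...
        }
    }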
So, why should I, if at all, use Debug.Assert instead of a plain exception? Placing an assertion where it shouldn't be can cause all kinds of "unwanted behavior", so from my point of view I really don't gain anything by using an assertion instead of throwing an exception. Do you agree with me, or am I missing something here?
Note: I fully understand what the difference is "in theory" (Debug vs. Release, usage patterns, etc.), but as I see it, I would still be better off throwing an exception instead of performing an assert. If a bug is discovered in a production release, I still want the "assertion" to fail (after all, the overhead is ridiculously small), so I'm better off throwing an exception instead.
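As a sketch of what I mean by the Debug vs. Release difference (again, the names below are made up): Debug.Assert is marked with [Conditional("DEBUG")], so the call disappears from a Release build entirely, while a thrown exception is ordinary code that stays in.

    using System;
    using System.Diagnostics;

    // Hypothetical example -- "Invariants" and "balance" are made-up names.
    public static class Invariants
    {
        // Debug.Assert is marked [Conditional("DEBUG")], so in a Release build
        // this call is removed and the invariant is never checked at runtime.
        public static void CheckWithAssert(int balance)
        {
            Debug.Assert(balance >= 0, "balance must never be negative");
        }

        // A thrown exception survives in Release builds, which is exactly
        // what I want when a bug shows up in production.
        public static void CheckWithException(int balance)
        {
            if (balance < 0)
                throw new InvalidOperationException("balance must never be negative");
        }
    }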
Edit: The way I see it, if an assert fails, the application has entered some kind of corrupted, unexpected state. So why would I want to continue execution? It doesn't matter whether the application is running a debug or a release build; the same goes for both.