I've implemented some sorting algorithms for integers in C, carefully using uint64_t for anything related to the data size (including counters and indices), since the algorithms also have to be tested on data sets of several billion integers.
The algorithms themselves should be fine, and the amount of data allocated should not be a problem: the data is stored in files and we only load small chunks at a time, and everything works fine even when we shrink the in-memory buffers to arbitrary sizes.
Tests with data sets of up to 4 Gints (i.e. 16 GB of data) work fine (sorting 4 Gints took 2228 seconds, about 37 minutes), but as soon as we go above that (e.g. 8 Gints) the algorithm doesn't seem to halt (it has been running for about 16 hours now).
I'm afraid the problem is an integer overflow: perhaps a loop counter is stored in a 32-bit variable, or perhaps we're calling some function that works with 32-bit integers.
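For illustration, here is a minimal, hypothetical reduction of the kind of bug I suspect (the variable names are made up, not taken from my actual code). With a 32-bit counter, the loop never reaches an n above 2^32, which would match the non-halting behaviour exactly:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t n = 8ULL * 1024 * 1024 * 1024;  /* 8 Gints, > 2^32 */
    unsigned int i;                          /* BUG: only 32 bits wide */
    uint64_t checksum = 0;

    /* In the comparison i is promoted to uint64_t, but the variable
     * itself wraps from 2^32 - 1 back to 0, so i never reaches n and
     * the loop runs forever. */
    for (i = 0; i < n; i++)
        checksum += i;

    printf("%llu\n", (unsigned long long)checksum);  /* never reached */
    return 0;
}
```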
What else could it be?
Is there any easy way to check whether an integer overflow occurs at runtime?
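One approach I'm considering (assuming GCC or Clang): compile a test build with -fsanitize=undefined (and, with Clang, -fsanitize=unsigned-integer-overflow), or GCC's -ftrapv for signed overflow, or add explicit checks on size calculations with the __builtin_*_overflow intrinsics (GCC 5+ and recent Clang). A minimal sketch of the explicit-check variant:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint64_t n_ints = 8ULL * 1024 * 1024 * 1024;   /* 8 Gints */
    uint64_t bytes64;
    uint32_t bytes32;

    /* __builtin_mul_overflow returns true if the exact product does not
     * fit in the destination variable, checking against its type. */
    if (__builtin_mul_overflow(n_ints, sizeof(uint32_t), &bytes64))
        fprintf(stderr, "overflow computing byte count in 64 bits\n");
    else
        printf("byte count (64-bit): %llu\n", (unsigned long long)bytes64);

    /* The same computation into a 32-bit variable is caught at runtime. */
    if (__builtin_mul_overflow(n_ints, sizeof(uint32_t), &bytes32))
        fprintf(stderr, "overflow: byte count does not fit in 32 bits\n");

    return 0;
}
```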