Your best bet would be to restructure your code to use async ajax calls and launch the next call when the previous one completes, and so on. This allows the page to redisplay between image fetches.
It also gives the browser a chance to breathe, take care of its other housekeeping, and not conclude that the page is locked up or hung.
And, using async: 'false' is a bad idea. I see no reason why properly structured code can't use asynchronous ajax calls here without hanging the browser while you're fetching this data.
You could do it with asynchronous ajax like this:
function getAllImages(position, maxImages) {
    var imgCount = 0;

    function getNextImage() {
        $.ajax({
            url: urlAJAX + 'scan=' + position,
            method: 'GET',
            async: true,
            success: function(data) {
                // keep chaining requests until we've rendered maxImages
                // or the server stops reporting success
                if (data.status == "success" && imgCount < maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    getNextImage();
                }
            }
        });
    }
    getNextImage();
}

// No while loop is needed.
// Just call getAllImages() and pass it the
// position and the maxImages you want to retrieve.
getAllImages('front', 20);
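As a side note, the success-only version above just stops quietly if a request fails at the network level (success never fires, so no further call is made). If you want to surface that, jQuery's $.ajax also accepts an error callback. Here's a minimal sketch of getNextImage() with one added; reportScanError() is a hypothetical handler you'd supply yourself:

function getNextImage() {
    $.ajax({
        url: urlAJAX + 'scan=' + position,
        method: 'GET',
        success: function(data) {
            if (data.status == "success" && imgCount < maxImages) {
                ++imgCount;
                renderImageData(data);
                getNextImage();            // request the next image
            }
        },
        error: function(jqXHR, textStatus, errorThrown) {
            // the chain stops here; log, retry, or notify the user
            reportScanError(textStatus);   // hypothetical error handler
        }
    });
}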
Also, while this may look like recursion, it isn't technically recursion because of the async nature of the ajax call: each call to getNextImage() has already returned before the next one starts, so the call stack never grows.
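For comparison, the same one-request-at-a-time pattern can be written with promises, which avoids the nested callbacks entirely. A minimal sketch using fetch() and async/await, assuming the endpoint returns JSON shaped like the data above:

// a minimal sketch of the same sequential fetching with async/await;
// assumes the endpoint returns JSON like {status: "success", ...}
async function getAllImagesAsync(position, maxImages) {
    for (let imgCount = 0; imgCount < maxImages; imgCount++) {
        const response = await fetch(urlAJAX + 'scan=' + position);
        const data = await response.json();
        if (data.status !== "success") {
            break;                  // stop on the first non-success reply
        }
        renderImageData(data);      // same rendering helper as above
    }
}

getAllImagesAsync('front', 20);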