
Objective-C: Non-lazy image loading in iOS

I'm trying to load UIImages on a background thread and then display them on the iPad. However, there's a stutter when I set the image view's image property to the loaded image. I soon figured out that image loading is lazy on iOS, and found a partial solution in this question:

CGImage/UIImage lazily loading on UI thread causes stutter

This actually forces the image to be loaded in the thread, but there's still a stutter when displaying the image.
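For reference, forceLoad in my project is a UIImage category along the lines of that answer; a minimal sketch of the idea (not the exact code from the linked question) is:

// Sketch of a forceLoad-style category: drawing the image once into an
// offscreen bitmap context makes CoreGraphics decode the compressed data
// immediately instead of deferring it to display time.
@implementation UIImage (ForceLoad)

- (void)forceLoad {
    CGImageRef cgImage = self.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
    // Passing NULL data and 0 bytes-per-row lets CoreGraphics manage the
    // backing store itself (iOS 4 and later).
    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, 0,
        colourSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGColorSpaceRelease(colourSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);
    CGContextRelease(context);
}

@end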

You can find my sample project here: http://www.jasamer.com/files/SwapTest.zip (edit: fixed version), check the SwapTestViewController. Try dragging the picture to see the stutter.

The test code I created that stutters is this (the forceLoad method is the one taken from the Stack Overflow question I linked above):

NSArray* imagePaths = [NSArray arrayWithObjects:
                       [[NSBundle mainBundle] pathForResource: @"a.png" ofType: nil], 
                       [[NSBundle mainBundle] pathForResource: @"b.png" ofType: nil], nil];

NSOperationQueue* queue = [[NSOperationQueue alloc] init];

[queue addOperationWithBlock: ^(void) {
    int imageIndex = 0;
    while (true) {
        UIImage* image = [[UIImage alloc] initWithContentsOfFile: [imagePaths objectAtIndex: imageIndex]];
        imageIndex = (imageIndex+1)%2;
        [image forceLoad];

        //What's missing here?

        [self performSelectorOnMainThread: @selector(setImage:) withObject: image waitUntilDone: YES];
        [image release];
    }
}];
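
(setImage: is just a thin helper on my view controller; roughly, it does no more than assign the image, along these lines:)

// Roughly what setImage: does: hand the freshly loaded image to the image view.
// (The version in the SwapTest sample project may differ slightly.)
- (void)setImage:(UIImage *)image {
    self.imageView.image = image;
}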

There are two reasons why I know the stuttering can be avoided:

(1) Apple is able to load images without stuttering in the Photos app

(2) This modified version of the code above does not stutter once placeholder1 and placeholder2 have each been displayed:

UIImage* placeholder1 = [[UIImage alloc] initWithContentsOfFile: [[NSBundle mainBundle] pathForResource: @"a.png" ofType: nil]];
[placeholder1 forceLoad];
UIImage* placeholder2 = [[UIImage alloc] initWithContentsOfFile: [[NSBundle mainBundle] pathForResource: @"b.png" ofType: nil]];
[placeholder2 forceLoad];

NSArray* imagePaths = [NSArray arrayWithObjects:
                       [[NSBundle mainBundle] pathForResource: @"a.png" ofType: nil], 
                       [[NSBundle mainBundle] pathForResource: @"b.png" ofType: nil], nil];
NSOperationQueue* queue = [[NSOperationQueue alloc] init];
[queue addOperationWithBlock: ^(void) {
    int imageIndex = 0;
    while (true) {
        //The image is not actually used here - just to prove that the background thread isn't causing the stutter
        UIImage* image = [[UIImage alloc] initWithContentsOfFile: [imagePaths objectAtIndex: imageIndex]];
        imageIndex = (imageIndex+1)%2;
        [image forceLoad];

        if (self.imageView.image==placeholder1) {
            [self performSelectorOnMainThread: @selector(setImage:) withObject: placeholder2 waitUntilDone: YES];
        } else {
            [self performSelectorOnMainThread: @selector(setImage:) withObject: placeholder1 waitUntilDone: YES];
        }                

        [image release];
    }
}];

However, I can't keep all my images in memory.

This implies that forceLoad doesn't do the complete job - there's something else going on before the images are actually displayed. Does anyone know what that is, and how I can put that into the background thread?

Thanks, Julian

Update

I used a few of Tommy's tips. What I figured out is that CGSConvertBGRA8888toRGBA8888 is what's taking so much time, so it seems a color conversion is causing the lag. Here's the (inverted) call stack of that method.

Running        Symbol Name
6609.0ms        CGSConvertBGRA8888toRGBA8888
6609.0ms         ripl_Mark
6609.0ms          ripl_BltImage
6609.0ms           RIPLayerBltImage
6609.0ms            ripc_RenderImage
6609.0ms             ripc_DrawImage
6609.0ms              CGContextDelegateDrawImage
6609.0ms               CGContextDrawImage
6609.0ms                CA::Render::create_image_by_rendering(CGImage*, CGColorSpace*, bool)
6609.0ms                 CA::Render::create_image(CGImage*, CGColorSpace*, bool)
6609.0ms                  CA::Render::copy_image(CGImage*, CGColorSpace*, bool)
6609.0ms                   CA::Render::prepare_image(CGImage*, CGColorSpace*, bool)
6609.0ms                    CALayerPrepareCommit_(CALayer*, CA::Transaction*)
6609.0ms                     CALayerPrepareCommit_(CALayer*, CA::Transaction*)
6609.0ms                      CALayerPrepareCommit_(CALayer*, CA::Transaction*)
6609.0ms                       CALayerPrepareCommit_(CALayer*, CA::Transaction*)
6609.0ms                        CALayerPrepareCommit
6609.0ms                         CA::Context::commit_transaction(CA::Transaction*)
6609.0ms                          CA::Transaction::commit()
6609.0ms                           CA::Transaction::observer_callback(__CFRunLoopObserver*, unsigned long, void*)
6609.0ms                            __CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION__
6609.0ms                             __CFRunLoopDoObservers
6609.0ms                              __CFRunLoopRun
6609.0ms                               CFRunLoopRunSpecific
6609.0ms                                CFRunLoopRunInMode
6609.0ms                                 GSEventRunModal
6609.0ms                                  GSEventRun
6609.0ms                                   -[UIApplication _run]
6609.0ms                                    UIApplicationMain
6609.0ms                                     main         

Sadly, the bit-mask changes he proposed at the end didn't change anything.


1 Reply


UIKit may be used on the main thread only. Your code is therefore technically invalid, since you use UIImage from a thread other than the main thread. You should use CoreGraphics alone to load (and non-lazily decode) graphics on a background thread, post the CGImageRef to the main thread and turn it into a UIImage there. It may appear to work (albeit with the stutter you don't want) in your current implementation, but it isn't guaranteed to. There seems to be a lot of superstition and bad practice advocated around this area, so it's not surprising you've managed to find some bad advice...

Recommended to run on a background thread:

// get a data provider referencing the relevant file
CGDataProviderRef dataProvider = CGDataProviderCreateWithFilename(filename);

// use the data provider to get a CGImage; release the data provider
CGImageRef image = CGImageCreateWithPNGDataProvider(dataProvider, NULL, NO, 
                                                    kCGRenderingIntentDefault);
CGDataProviderRelease(dataProvider);

// make a bitmap context of a suitable size to draw to, forcing decode
size_t width = CGImageGetWidth(image);
size_t height = CGImageGetHeight(image);
unsigned char *imageBuffer = (unsigned char *)malloc(width*height*4);

CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();

CGContextRef imageContext =
    CGBitmapContextCreate(imageBuffer, width, height, 8, width*4, colourSpace,
                  kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);

CGColorSpaceRelease(colourSpace);

// draw the image to the context, release it
CGContextDrawImage(imageContext, CGRectMake(0, 0, width, height), image);
CGImageRelease(image);

// now get an image ref from the context
CGImageRef outputImage = CGBitmapContextCreateImage(imageContext);

// post that off to the main thread, where you might do something like
// [UIImage imageWithCGImage:outputImage]
[self performSelectorOnMainThread:@selector(haveThisImage:) 
         withObject:[NSValue valueWithPointer:outputImage] waitUntilDone:YES];

// clean up
CGImageRelease(outputImage);
CGContextRelease(imageContext);
free(imageBuffer);

There's no need to do the malloc/free if you're on iOS 4 or later; you can just pass NULL as the first (data) parameter of CGBitmapContextCreate and let CoreGraphics sort out its own storage.
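
On the main thread, a haveThisImage: along these lines would do (only the selector name appears in the code above; the body here is one plausible sketch):

// Hypothetical main-thread receiver: unwrap the pointer and hand the CGImage
// to UIKit. Because the background thread uses waitUntilDone:YES before its
// CGImageRelease, the UIImage retains the CGImage before it can be freed.
- (void)haveThisImage:(NSValue *)value {
    CGImageRef cgImage = (CGImageRef)[value pointerValue];
    self.imageView.image = [UIImage imageWithCGImage:cgImage];
}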

This differs from the solution you linked to because it:

  1. creates a CGImage from a PNG data source — lazy loading applies, so this isn't necessarily a fully loaded and decompressed image
  2. creates a bitmap context of the same size as the PNG
  3. draws the CGImage from the PNG data source onto the bitmap context — this should force full loading and decompression since the actual colour values have to be put somewhere we could access them from a C array. This step is as far as the forceLoad you link to goes.
  4. converts the bitmap context into an image
  5. posts that image off to the main thread, presumably to become a UIImage

So there's no continuity of object between the thing loaded and the thing displayed; pixel data goes through a C array (so, no opportunity for hidden shenanigans) and only if it was put into the array correctly is it possible to make the final image.

