I am writing a .jpg file to my app's Documents directory like this:
NSData *img = UIImageJPEGRepresentation(myUIImage, 1.0);
BOOL retValue = [img writeToFile:myFilePath atomically:YES];
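For context, myFilePath points into the Documents directory and is built roughly like this (the file name is just a placeholder):
NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *myFilePath = [docsDir stringByAppendingPathComponent:@"photo.jpg"]; // placeholder name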
Later, I load that image back into a UIImage using:
UIImage *myImage = [UIImage imageWithContentsOfFile:myFilePath];
I know the file is valid because I can display the image in a table cell and it looks fine. But if I then call UIImageJPEGRepresentation(myImage, 1.0), the debugger prints these lines:
<Error>: Not a JPEG file: starts with 0xff 0xd9
<Error>: Application transferred too few scanlines
And the function returns nil. Does anybody have an idea why this is happening? I haven't manipulated the UIImage data after loading it; I just assigned the UIImage to an image view in a cell. I did set the image view's properties so that all the images in the cells line up and are the same size, but I don't see how that could affect converting the UIImage back to NSData.
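For completeness, the failing call looks roughly like this (roundTripData is just my name for the result; myImage is the same one loaded above):
NSData *roundTripData = UIImageJPEGRepresentation(myImage, 1.0);
if (roundTripData == nil) {
    NSLog(@"re-encoding returned nil"); // this is the branch I always end up in
}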