Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

0 votes
495 views
in Technique by (71.8m points)

objective c - Detect black pixel in image iOS

As of now I am searching every pixel one by one, checking its color and seeing if it's black; if it isn't, I move on to the next pixel. This is taking forever, as I can only check approximately 100 pixels per second (speeding up my NSTimer freezes the app because it can't check fast enough). So is there any way I can just have the code return all the pixels that are black and ignore everything else, so I only have to check those pixels and not every pixel? I am trying to detect the black pixel furthest to the left in my image.

Here is my current code.

- (void)viewDidLoad {
    [super viewDidLoad];
    timer = [NSTimer scheduledTimerWithTimeInterval: 0.01
                                             target: self
                                           selector:@selector(onTick:)
                                           userInfo: nil repeats:YES];
    y1 = 0;
    x1 = 0;
    initialImage = 0;
    height1 = 0;
    width1 = 0;
}

-(void)onTick:(NSTimer *)timer {
    if (initialImage != 1) {
        /*
        IMAGE INITIALLY GETS SET HERE... "image2.image = [blah blah blah];" took this out for non disclosure reasons
        */
        initialImage = 1;
    }
    //image2 is the image I'm checking the pixels of.
    width1 = (int)image2.size.width;
    height1 = (int)image2.size.height;
    CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image2.CGImage));
    const UInt32 *pixels = (const UInt32*)CFDataGetBytePtr(imageData);
    if ( (pixels[(x1+(y1*width1))]) == 0x000000) { //0x000000 is black right?
        NSLog(@"black!");
        NSLog(@"x = %i", x1);
        NSLog(@"y = %i", y1);
    }else {
        NSLog(@"val: %lu", (pixels[(x1+(y1*width1))]));
        NSLog(@"x = %i", x1);
        NSLog(@"y = %i", y1);
        x1 ++;
        if (x1 >= width1) {
            y1 ++;
            x1 = 0;
        }
    }
    if (y1 > height1) {
        /*
        MY UPDATE IMAGE CODE GOES HERE (IMAGE CHANGES EVERY TIME ALL PIXELS HAVE BEEN CHECKED)
        */
        y1 = 0;
        x1 = 0;
    }
    CFRelease(imageData); // release the copied pixel data each tick
}

Also, what if a pixel is really close to black but not perfectly black? Can I add a margin of error in there somewhere so it will still detect pixels that are, say, 95% black? Thanks!



1 Reply

0 votes
by (71.8m points)

Why are you using a timer at all? Why not just have a double for loop in your function that loops over all possible x- and y-coordinates in the image? Surely that would be waaaay faster than only checking at most 100 pixels per second. You would want to have the x (width) coordinates in the outer loop and the y (height) coordinates in the inner loop so that you are effectively scanning one column of pixels at a time from left to right, since you are trying to find the leftmost black pixel.

Also, are you sure that each pixel in your image has a 4-byte (UInt32) representation? A standard bitmap would have 3 bytes per pixel. To check if a pixel is close to black, you would just examine each byte in the pixel separately and make sure they are all less than some threshold.

EDIT: OK, since you are using UIGetScreenImage, I'm going to assume that it is 4-bytes per pixel.

const UInt8 *pixels = CFDataGetBytePtr(imageData);
UInt8 blackThreshold = 10; // or some value close to 0
int bytesPerPixel = 4;
for(int x = 0; x < width1; x++) {
  for(int y = 0; y < height1; y++) {
    int pixelStartIndex = (x + (y * width1)) * bytesPerPixel;
    UInt8 alphaVal = pixels[pixelStartIndex]; // can probably ignore this value
    UInt8 redVal = pixels[pixelStartIndex + 1];
    UInt8 greenVal = pixels[pixelStartIndex + 2];
    UInt8 blueVal = pixels[pixelStartIndex + 3];
    if(redVal < blackThreshold && blueVal < blackThreshold && greenVal < blackThreshold) {
      //This pixel is close to black...do something with it
    }
  }
}

If it turns out that bytesPerPixel is 3, then change that value accordingly, remove the alphaVal from the for loop, and subtract 1 from the indices of the red, green, and blue values.

Also, my current understanding is that UIGetScreenImage is considered a private function that Apple may or may not reject you for using.

