I'm not asking about the exact code, but about the overall idea.
Here is my problem: I'm trying to build something similar to the filter-choosing UI in the Photos app. I've tried multiple approaches, and all of them have drawbacks.
1) I've tried using Operation and OperationQueue with a collection view that has prefetching enabled. This loads the viewController quickly but drops frames while scrolling.
2) Right now I'm using a scroll view and GCD. This scrolls smoothly, but the viewController takes too long to load, because it applies all filters to all of its buttons at once.
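A way to get both a fast load and smooth scrolling is to keep the collection view, render each filtered thumbnail once on a background queue, and cache the result so scrolling never re-renders. Below is a minimal sketch of that idea, not the Photos app's actual implementation; `ThumbnailCache` and the `render` closure are illustrative names, and the closure stands in for the Core Image work (a real version would store a UIImage rather than Data):

```swift
import Foundation

// Hypothetical sketch: render each filtered thumbnail once on a background
// queue, cache it, and serve cache hits instantly while scrolling.
final class ThumbnailCache {
    private let cache = NSCache<NSNumber, NSData>()
    private let renderQueue = DispatchQueue(label: "thumbnail.render",
                                            qos: .userInitiated)

    func thumbnail(at index: Int,
                   completionQueue: DispatchQueue = .main,
                   render: @escaping () -> Data,
                   completion: @escaping (Data) -> Void) {
        if let hit = cache.object(forKey: index as NSNumber) {
            // Cache hit: no re-rendering while the user scrolls back and forth.
            completionQueue.async { completion(hit as Data) }
            return
        }
        renderQueue.async {
            let rendered = render()                   // expensive filter work
            self.cache.setObject(rendered as NSData, forKey: index as NSNumber)
            completionQueue.async { completion(rendered) }
        }
    }
}
```

With this shape, `collectionView(_:prefetchItemsAt:)` can warm the cache ahead of scrolling, and `cellForItemAt` only ever pays for a cache lookup once a thumbnail has been rendered.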
NOTE: To answer the question there is no need to read the part below (I believe); however, if you are interested in how I'm trying to implement the functionality, you're welcome to read it.
For the implementation of all filters I use a struct called Filters, which is responsible for initializing each filter and appending it to an array.
struct Filters {
    var image: UIImage
    var allFilters: [CIFilter] = []

    init(image: UIImage) {
        self.image = image
        guard let sepia = Sepia(image: image) else { return }
        allFilters.append(contentsOf: [sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia, sepia])
    }
}
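One thing worth noting about the struct above: `append(contentsOf: [sepia, sepia, …])` puts the same CIFilter instance into every slot. CIFilter is a class, so all thirteen entries share one `inputImage` and one `inputIntensity`, and mutating one mutates them all. Here is a minimal sketch of the pitfall using a hypothetical stand-in class (`FakeFilter` is not CIFilter, it just reproduces the reference semantics):

```swift
import Foundation

// FakeFilter is a hypothetical stand-in for a CIFilter subclass; like
// CIFilter it is a class, so arrays hold references, not copies.
final class FakeFilter {
    var inputIntensity: Double = 1.0
}

// Repeating the same instance: every slot is one and the same object.
let sharedSlots = [FakeFilter](repeating: FakeFilter(), count: 3)
sharedSlots[0].inputIntensity = 0.5
print(sharedSlots.map { $0.inputIntensity })      // [0.5, 0.5, 0.5]

// Creating a fresh instance per slot keeps each filter independent.
let independentSlots = (0..<3).map { _ in FakeFilter() }
independentSlots[0].inputIntensity = 0.5
print(independentSlots.map { $0.inputIntensity }) // [0.5, 1.0, 1.0]
```

If each thumbnail should eventually get its own intensity (or its own filter type), building the array with one `Sepia(image:)` call per slot avoids this shared state.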
Right now I'm using only one filter. Sepia is a subclass of CIFilter. I've made it a subclass because in the future I'm going to build a custom filter from it. Here is its implementation:
class Sepia: CIFilter {
    var inputImage: CIImage?
    var inputIntensity: NSNumber?

    @objc override var filterName: String? {
        return NSLocalizedString("Sepia", comment: "Name of a Filter")
    }

    convenience init?(image: UIImage, inputIntensity: NSNumber? = nil) {
        self.init()
        guard let cgImage = image.cgImage else {
            return nil
        }
        if inputIntensity != nil {
            self.inputIntensity = inputIntensity
        } else {
            self.setDefaults()
        }
        self.inputImage = CIImage(cgImage: cgImage)
    }

    override func setDefaults() {
        inputIntensity = 1.0
    }

    override var outputImage: CIImage? {
        guard let inputImage = inputImage, let inputIntensity = inputIntensity else {
            return nil
        }
        let filter = CIFilter(name: "CISepiaTone",
                              withInputParameters: [kCIInputImageKey: inputImage,
                                                    kCIInputIntensityKey: inputIntensity])
        return filter?.outputImage
    }
}
In the viewController's viewDidLoad I initialize the Filters struct:

    self.filters = Filters(image: image)

Then I call a method that configures some views (filterViews) based on the number of filters in the filters.allFilters array, iterates over them, and for each one calls a method that takes a thumbnail UIImage, applies a filter to it, and returns the result in a completion handler (I use a DispatchGroup inside it for debugging purposes). Here is the method that applies a filter to a thumbnail:
func imageWithFilter(filter: CIFilter, completion: @escaping (UIImage?) -> Void) {
    let group = DispatchGroup()
    group.enter()
    DispatchQueue.global().async {
        guard let outputImage = filter.value(forKey: kCIOutputImageKey) as? CIImage,
              let cgImageResult = self.context.createCGImage(outputImage, from: outputImage.extent) else {
            DispatchQueue.main.async {
                completion(nil)
            }
            group.leave()
            return
        }
        let filteredImage = UIImage(cgImage: cgImageResult)
        DispatchQueue.main.async {
            print(filteredImage)
            completion(filteredImage)
        }
        group.leave()
    }
    group.notify(queue: .main) {
        print("Filters are set")
    }
}
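An aside on the DispatchGroup: a group created inside imageWithFilter can only ever track that single call, so its notify fires after one filter finishes rather than after the whole batch. To be notified once, after every thumbnail has rendered, the group has to be hoisted out and shared across all the calls. A sketch of that shape, under assumed names (`renderAll` and `renderThumbnail` are illustrative; the closure stands in for the Core Image work):

```swift
import Foundation

// Sketch: one DispatchGroup shared across the whole batch, so `notify`
// fires exactly once, after every filter has rendered.
func renderAll(count: Int,
               renderThumbnail: @escaping (Int) -> String,
               queue: DispatchQueue = DispatchQueue.global(qos: .userInitiated),
               completion: @escaping ([String]) -> Void) {
    let group = DispatchGroup()
    let lock = NSLock()
    var results = [String?](repeating: nil, count: count)

    for index in 0..<count {
        group.enter()
        queue.async {
            let image = renderThumbnail(index)   // expensive filter work
            lock.lock()
            results[index] = image               // keep results in display order
            lock.unlock()
            group.leave()
        }
    }
    // Fires once, after every enter() has been balanced by a leave().
    group.notify(queue: queue) {
        completion(results.compactMap { $0 })
    }
}
```

In a UIKit version, the completion would be dispatched to the main queue and would assign the rendered images to the filterViews.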
The print statement above and the filtered image's address are printed quite soon, but the images don't appear inside the views.
I have tried using the Time Profiler, but it gives me some odd results. For example, it shows the following as taking quite long to execute at the root of the backtrace:
When I try to view that code in Xcode, I get the following, which doesn't help much:
So, this is the problem. If you have any ideas about how the Photos app implements this so fast and responsively, or any suggestions about my implementation, I would highly appreciate your help.