I am using GPUImage with a number of GPUImageView instances. The goal is to display the original image, layer a few filtered slices on top of it, and then slowly animate those slices across the original. Picture sepia-toned stripes scrolling over an image, showing the normal and sepia versions side by side.
I wrapped this functionality in a UIView subclass, shown below:
import Foundation
import QuartzCore

class FilteredImageMaskView : UIView {

    init(frame: CGRect, image: UIImage){
        super.init(frame: frame);

        let imageViewFrame = CGRectMake(frame.origin.x, 0.0, frame.size.width, frame.size.height);
        let origImage = GPUImagePicture(image: image);
        origImage.forceProcessingAtSizeRespectingAspectRatio(imageViewFrame.size);

        // Display the original image without a filter
        let imageView = GPUImageView(frame: imageViewFrame);
        origImage.addTarget(imageView);

        origImage.processImageWithCompletionHandler(){
            origImage.removeAllTargets();

            var contentMode = UIViewContentMode.ScaleAspectFit;
            imageView.contentMode = contentMode;

            // Width of the unfiltered region
            let regularWidth: CGFloat = 30.0;

            // Width of the filtered region
            let filterWidth: CGFloat = 30.0;

            // How much we are moving each bar
            let totalXMovement = (regularWidth + filterWidth) * 2;

            // The start X position
            var currentXForFilter: CGFloat = -totalXMovement;

            // The filter being applied to the image
            let filter = GPUImageSepiaFilter();
            filter.intensity = 0.5;

            // Add the filter to the original image
            origImage.addTarget(filter);

            let filteredViewCollection = FilteredViewCollection(filteredViews: [GPUImageView]());

            // Iterate over the X positions until the whole image is covered
            while(currentXForFilter < imageView.frame.width + totalXMovement){

                let frame = CGRectMake(currentXForFilter, imageViewFrame.origin.y, imageViewFrame.width, imageViewFrame.height);
                var filteredView = GPUImageView(frame: frame);
                filteredView.clipsToBounds = true;
                filteredView.layer.contentsGravity = kCAGravityTopLeft;

                // This is the slice of the overall image that we are going to display as filtered
                filteredView.layer.contentsRect = CGRectMake(currentXForFilter / imageViewFrame.width, 0.0, filterWidth / imageViewFrame.width, 1.0);
                filteredView.fillMode = kGPUImageFillModePreserveAspectRatio;
                filter.addTarget(filteredView);

                // Add the filteredView to the super view
                self.addSubview(filteredView);

                // Add the filteredView to the collection so we can animate it later
                filteredViewCollection.filteredViews.append(filteredView);

                // Increment the X position
                currentXForFilter += regularWidth + filterWidth;
            }

            origImage.processImageWithCompletionHandler(){
                filter.removeAllTargets();

                // Move to the UI thread
                ThreadUtility.runOnMainThread(){

                    // Add the unfiltered image
                    self.addSubview(imageView);

                    // And move it behind the filtered slices
                    self.sendSubviewToBack(imageView);

                    // Animate the slices slowly across the image
                    UIView.animateWithDuration(20.0, delay: 0.0, options: UIViewAnimationOptions.Repeat, animations: { [weak filteredViewCollection] in
                        if let strongfilteredViewCollection = filteredViewCollection {
                            if(strongfilteredViewCollection.filteredViews != nil){
                                for(var i = 0; i < strongfilteredViewCollection.filteredViews.count; i++){
                                    strongfilteredViewCollection.filteredViews[i].frame.origin.x += totalXMovement;
                                    strongfilteredViewCollection.filteredViews[i].layer.contentsRect.origin.x += (totalXMovement / imageView.frame.width);
                                }
                            }
                        }
                    }, completion: nil);
                }
            }
        }
    }

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder);
    }
}

class FilteredViewCollection {

    var filteredViews: [GPUImageView]! = [GPUImageView]();

    init(filteredViews: [GPUImageView]!){
        self.filteredViews = filteredViews;
    }
}
An instance of FilteredImageMaskView is added programmatically to a view in a viewController. When the viewController is dismissed, I assume the resources are disposed of; I have been careful to avoid retain cycles. When I watch memory consumption in the debugger on a real device, memory does drop correctly when the viewController is dismissed. However, if I repeatedly load the viewController to view an image, dismiss it, and reload it, I eventually hit "app was terminated due to memory error".
If I wait a while after dismissing the viewController, the memory error seems less frequent, which leads me to believe memory is still being released after the viewController is dismissed...? However, I have only seen the error a couple of times when not opening and dismissing the viewController so rapidly.
I must be using GPUImage and/or GPUImageView inefficiently, and I am looking for guidance.
Thanks!
Edit: see the view controller implementation below.
import UIKit

class ViewImageViewController: UIViewController, FetchImageDelegate {

    var imageManager = ImageManager();

    @IBOutlet var mainView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        imageManager.fetchImageAsync(delegate: self);
    }

    // This callback is dispatched on the UI thread
    func imageFetchCompleted(imageData: [UInt8]) {
        let imageView = FilteredImageMaskView(frame: self.mainView.frame, image: UIImage(data: imageData));
        mainView.addSubview(imageView);
        var timer = NSTimer.scheduledTimerWithTimeInterval(NSTimeInterval(10.0), target: self, selector: Selector("displayReminder"), userInfo: nil, repeats: false);
    }

    func displayReminder(){
        // Show an alert or message here
    }
}

class ImageManager {

    func fetchImageAsync(delegate: FetchImageDelegate) {
        // This dispatches a high priority background thread
        ThreadUtility.runOnHighPriorityBackgroundThread() { [weak delegate] in

            // Get the image (This part could take a while in the real implementation)
            var imageData = [UInt8]();

            // Move to the UI thread
            ThreadUtility.runOnMainThread({
                if let strongDelegate = delegate {
                    strongDelegate.imageFetchCompleted(imageData);
                }
            });
        }
    }
}
Now that I am looking at this stripped-down version: does passing self to ImageManager create a retain cycle, even though I capture it weakly on the background thread? Should I pass a weak reference to the ViewImageViewController instead? The ViewImageViewController could be dismissed before fetchImageAsync completes and invokes the callback.
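For reference, the `[weak delegate]` capture in `fetchImageAsync` above is the pattern that keeps a stored closure from holding the controller alive. A minimal, self-contained sketch of that behavior, using illustrative names and current Swift syntax rather than the Swift 1.x used above:

```swift
// A class-constrained protocol so the delegate can be captured weakly.
protocol FetchDelegate: AnyObject {
    func fetchCompleted()
}

final class Manager {
    // The manager stores the work closure, simulating an async fetch
    // that has not run yet.
    var pendingWork: (() -> Void)?

    func fetchAsync(delegate: FetchDelegate) {
        // Weak capture: the stored closure does not keep the
        // controller alive.
        pendingWork = { [weak delegate] in
            delegate?.fetchCompleted()
        }
    }
}

final class Controller: FetchDelegate {
    static var liveInstances = 0
    init() { Controller.liveInstances += 1 }
    deinit { Controller.liveInstances -= 1 }
    func fetchCompleted() {}
}

let manager = Manager()
do {
    let vc = Controller()
    manager.fetchAsync(delegate: vc)
}   // vc goes out of scope here

// Because the capture is weak, the controller has been deallocated
// even though the manager still holds the closure.
print(Controller.liveInstances)
```

So passing self as a delegate parameter does not by itself create a cycle; what matters is whether anything long-lived stores a strong reference to it.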
Edit: I think I found the problem. If you look at the imageFetchCompleted callback in ViewImageViewController, I create an NSTimer and pass it self as the target. My suspicion is that if the viewController is dismissed before the timer fires, the timer keeps it alive. That would explain why I do not get the memory error if I wait a few seconds: the timer fires and the viewController is disposed of correctly. Here is the fix (I think).
// This is on the ViewImageViewController
var timer: NSTimer!;

// Then instead of creating a new local variable, assign the timer to the class variable
self.timer = NSTimer.scheduledTimerWithTimeInterval(NSTimeInterval(10.0), target: self, selector: Selector("displayReminder"), userInfo: nil, repeats: false);

// And finally, on dismiss of the viewController (viewWillDisappear or back button click event, or both)
func cancelTimer() {
    if(self.timer != nil){
        self.timer.invalidate();
        self.timer = nil;
    }
}
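The same fix can be sketched with the block-based Timer API, which sidesteps target retention entirely because the closure can capture self weakly. This is a runnable illustration under assumed names (Reminder, displayReminder, cancelTimer are stand-ins, not the original code), in current Swift syntax:

```swift
import Foundation

final class Reminder {
    var timer: Timer?

    func schedule() {
        // The run loop keeps a scheduled timer alive until it fires
        // or is invalidated; [weak self] ensures the timer's block
        // does not also keep self alive for those 10 seconds.
        timer = Timer.scheduledTimer(withTimeInterval: 10.0,
                                     repeats: false) { [weak self] _ in
            self?.displayReminder()
        }
    }

    // Call this from viewWillDisappear (or the back-button handler)
    // so the run loop releases the timer immediately.
    func cancelTimer() {
        timer?.invalidate()
        timer = nil
    }

    func displayReminder() {
        // Show an alert or message here
    }
}

let reminder = Reminder()
reminder.schedule()
print(reminder.timer!.isValid)   // true while scheduled
reminder.cancelTimer()
print(reminder.timer == nil)     // true after cancelling
```

Either way, invalidating in viewWillDisappear is the important part: a pending one-shot timer otherwise delays the controller's deallocation until it fires.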
Answer 0 (score: 0)
The FilteredImageMaskView is strongly referenced inside the processImageWithCompletionHandler block, which can form a retain cycle. Try using a weak self inside the block.
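A minimal sketch of why that helps, with a stand-in for the GPUImage pipeline (FakePicture is purely illustrative): the view owns the picture, the picture stores the completion block, and a strong self capture would close the cycle view → picture → block → view. A weak capture breaks it:

```swift
final class FakePicture {
    // Stands in for GPUImagePicture holding on to the completion block.
    private var handler: (() -> Void)?

    func processImage(completion: @escaping () -> Void) {
        handler = completion   // the pipeline stores the block
    }
}

final class MaskView {
    static var liveInstances = 0
    let picture = FakePicture()

    init() { MaskView.liveInstances += 1 }
    deinit { MaskView.liveInstances -= 1 }

    func start() {
        // [weak self] breaks the cycle view -> picture -> block -> view
        picture.processImage { [weak self] in
            self?.addSlices()
        }
    }

    func addSlices() {}
}

do {
    let view = MaskView()
    view.start()
}   // view goes out of scope here

// The view was deallocated despite the stored block.
print(MaskView.liveInstances)
```

With a plain strong capture of self in start(), the view (and everything the GPUImage pipeline holds) would leak each time the controller is dismissed, which matches the symptom of memory accumulating across repeated open/dismiss cycles.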