
This is my first attempt at Core Image (on OS X 10.7.3), and I'm running into a brick wall. I'm sure it's something silly I'm doing, and someone more familiar with the framework can point it out. The problem: an "unrecognized selector" exception when trying to access a CIFilter's outputImage.

Consider the following code (assume imageURL is a valid file URL pointing to a JPEG on disk):

CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL]; 
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues: 
                kCIInputImageKey, inputImage, 
                kCIInputExtentKey, [inputImage valueForKey:@"extent"], 
                nil]; 
CIImage *outputImage = (CIImage *)[filter valueForKey:@"outputImage"]; 

Running this code, the last line throws:

0 CoreFoundation      0x00007fff96c2efc6 __exceptionPreprocess + 198 
1 libobjc.A.dylib      0x00007fff9153cd5e objc_exception_throw + 43 
2 CoreFoundation      0x00007fff96cbb2ae -[NSObject doesNotRecognizeSelector:] + 190 
3 CoreFoundation      0x00007fff96c1be73 ___forwarding___ + 371 
4 CoreFoundation      0x00007fff96c1bc88 _CF_forwarding_prep_0 + 232 
5 CoreImage       0x00007fff8f03c38d -[CIAreaAverage outputImage] + 52 
6 Foundation       0x00007fff991d8384 _NSGetUsingKeyValueGetter + 62 
7 Foundation       0x00007fff991d8339 -[NSObject(NSKeyValueCoding) valueForKey:] + 392 

Now, the Core Image Filter Reference clearly states that CIAreaAverage "returns a single-pixel image that contains the average color for the region of interest." Even more puzzling, here is what I see when I inspect the filter's attributes in the debugger (before attempting the valueForKey: call):

(lldb) po [filter attributes] 
(id) $3 = 0x00007fb3e3ef0e00 { 
    CIAttributeDescription = "Calculates the average color for the specified area in an image, returning the result in a pixel."; 
    CIAttributeFilterCategories =  (
     CICategoryReduction, 
     CICategoryVideo, 
     CICategoryStillImage, 
     CICategoryBuiltIn 
    ); 
    CIAttributeFilterDisplayName = "Area Average"; 
    CIAttributeFilterName = CIAreaAverage; 
    CIAttributeReferenceDocumentation = "http://developer.apple.com/cgi-bin/apple_ref.cgi?apple_ref=//apple_ref/doc/filter/ci/CIAreaAverage"; 
    inputExtent =  { 
     CIAttributeClass = CIVector; 
     CIAttributeDefault = "[0 0 640 80]"; 
     CIAttributeDescription = "A rectangle that specifies the subregion of the image that you want to process."; 
     CIAttributeDisplayName = Extent; 
     CIAttributeType = CIAttributeTypeRectangle; 
     CIUIParameterSet = CIUISetBasic; 
    }; 
    inputImage =  { 
     CIAttributeClass = CIImage; 
     CIAttributeDescription = "The image to process."; 
     CIAttributeDisplayName = Image; 
     CIUIParameterSet = CIUISetBasic; 
    }; 
    outputImage =  { 
     CIAttributeClass = CIImage; 
    }; 
} 

outputImage is right there, listed with type CIImage!

So what am I doing wrong? All the documentation and tutorials I've seen indicate that -valueForKey: is the correct way to access attributes, including outputImage.

Answer

I believe your extent is the culprit (strange as that seems). When I changed the extent to a CIVector *, it worked:

NSURL *imageURL = [NSURL fileURLWithPath:@"/Users/david/Desktop/video.png"]; 
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL]; 
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"]; 
[filter setValue:inputImage forKey:kCIInputImageKey]; 
CGRect inputExtent = [inputImage extent]; 
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x 
             Y:inputExtent.origin.y 
             Z:inputExtent.size.width 
             W:inputExtent.size.height]; 
[filter setValue:extent forKey:kCIInputExtentKey]; 
CIImage *outputImage = [filter valueForKey:@"outputImage"]; 

[inputImage extent] returns a CGRect, and KVC ([inputImage valueForKey:@"extent"]) boxes that struct in an NSValue, which the filter can't interpret; that matches the unrecognized-selector crash inside -[CIAreaAverage outputImage]. For a rectangle-typed parameter like inputExtent (note CIAttributeClass = CIVector in your attributes dump), the filter wants a CIVector.

Hmm... not what I would have expected. I'll check it this evening and get back to you. Thanks! – 2012-03-15 23:26:12

I'd be curious to know whether this worked for you. – devguydavid 2012-03-27 04:57:29

Sorry, I haven't forgotten; I've just been swamped with other things (this is a "personal curiosity" project). I promise I'll follow up... – 2012-03-27 07:15:55

Here is how I got CIAreaAverage working in an iOS app:

CGRect inputExtent = [self.inputImage extent];
CIVector *extent = [CIVector vectorWithX:inputExtent.origin.x
                                       Y:inputExtent.origin.y
                                       Z:inputExtent.size.width
                                       W:inputExtent.size.height];
CIImage *inputAverage = [CIFilter filterWithName:@"CIAreaAverage" keysAndValues:
                         @"inputImage", self.inputImage,
                         @"inputExtent", extent,
                         nil].outputImage;

// Alternatively (iOS 8+):
// CIImage *inputAverage = [self.inputImage imageByApplyingFilter:@"CIAreaAverage"
//                                            withInputParameters:@{@"inputExtent" : extent}];

EAGLContext *myEAGLContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
NSDictionary *options = @{ kCIContextWorkingColorSpace : [NSNull null] };
CIContext *myContext = [CIContext contextWithEAGLContext:myEAGLContext options:options];

size_t rowBytes = 32;         // one RGBA8 pixel needs 4 bytes; padded for alignment
uint8_t byteBuffer[rowBytes]; // buffer to render the 1x1 result into

[myContext render:inputAverage toBitmap:byteBuffer rowBytes:rowBytes bounds:[inputAverage extent] format:kCIFormatRGBA8 colorSpace:nil];

const uint8_t *pixel = &byteBuffer[0];
float red   = pixel[0] / 255.0;
float green = pixel[1] / 255.0;
float blue  = pixel[2] / 255.0;
NSLog(@"%f, %f, %f\n", red, green, blue);

The output will look something like this:

2015-05-23 15:58:20.935 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196 
2015-05-23 15:58:20.981 CIFunHouse[2400:489913] 0.752941, 0.858824, 0.890196 