Putting an H.264 I-frame into AVSampleBufferDisplayLayer, but no video image is displayed

After reviewing WWDC 2014 Session 513 in detail, I tried to write an app on iOS 8.0 that decodes and displays a live H.264 stream. First, I construct the H.264 parameter sets successfully. When I get a frame with a 4-byte start code, such as "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer from that CMBlockBuffer. After that, I enqueue the CMSampleBuffer into an AVSampleBufferDisplayLayer. Everything works (I checked the returned values), except that the AVSampleBufferDisplayLayer does not display any video image. Since these APIs are fairly new to everyone, I couldn't find anyone who could solve this problem.
The key code is given below. I would really appreciate it if you could help figure out why no video image is displayed. Many thanks.
(1) AVSampleBufferDisplayLayer initialization. dspLayer is an Objective-C property of my main view controller.
@property (nonatomic, strong) AVSampleBufferDisplayLayer *dspLayer;

if (!_dspLayer)
{
    _dspLayer = [[AVSampleBufferDisplayLayer alloc] init];
    [_dspLayer setFrame:CGRectMake(90, 551, 557, 389)];
    _dspLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    _dspLayer.backgroundColor = [UIColor grayColor].CGColor;

    CMTimebaseRef tmBase = nil;
    CMTimebaseCreateWithMasterClock(NULL, CMClockGetHostTimeClock(), &tmBase);
    _dspLayer.controlTimebase = tmBase;
    CMTimebaseSetTime(_dspLayer.controlTimebase, kCMTimeZero);
    CMTimebaseSetRate(_dspLayer.controlTimebase, 1.0);

    [self.view.layer addSublayer:_dspLayer];
}
(2) In another thread, I get an H.264 I-frame.

// Constructing the H.264 parameter sets — OK
CMVideoFormatDescriptionRef formatDesc;
OSStatus formatCreateResult =
    CMVideoFormatDescriptionCreateFromH264ParameterSets(NULL, ppsNum + 1, props, sizes, 4, &formatDesc);
NSLog(@"construct h264 param set: %d", (int)formatCreateResult);
// Constructing the CMBlockBuffer. dataBuf points to the H.264 data, starting with "0x00 0x00 0x00 0x01 0x65 ..."
CMBlockBufferRef blockBufferOut = nil;
CMBlockBufferCreateEmpty(0, 0, kCMBlockBufferAlwaysCopyDataFlag, &blockBufferOut);
CMBlockBufferAppendMemoryBlock(blockBufferOut,
                               dataBuf,
                               dataLen,
                               NULL,
                               NULL,
                               0,
                               dataLen,
                               kCMBlockBufferAlwaysCopyDataFlag);
// Constructing the CMSampleBuffer — OK
size_t sampleSizeArray[1] = {0};
sampleSizeArray[0] = CMBlockBufferGetDataLength(blockBufferOut);
CMSampleTimingInfo tmInfos[1] = {
    {CMTimeMake(5, 1), CMTimeMake(5, 1), CMTimeMake(5, 1)}
};
CMSampleBufferRef sampBuf = nil;
formatCreateResult = CMSampleBufferCreate(kCFAllocatorDefault,
                                          blockBufferOut,
                                          YES,
                                          NULL,
                                          NULL,
                                          formatDesc,
                                          1,
                                          1,
                                          tmInfos,
                                          1,
                                          sampleSizeArray,
                                          &sampBuf);
// Enqueueing into the AVSampleBufferDisplayLayer — just one frame, but I can't see any video frame in my view
if ([self.dspLayer isReadyForMoreMediaData])
{
    [self.dspLayer enqueueSampleBuffer:sampBuf];
}
[self.dspLayer setNeedsDisplay];
Scythe42's answer may solve your problem. I also had some trouble getting it to work, but in the end I did. You should [take a look](http://stackoverflow.com/questions/25980070/how-to-use-avsamplebufferdisplaylayer-in-ios-8-for-rtp-h264-streams-with-gstream). – Zappel 2014-10-29 19:09:07
Same here. I have a valid & ready CMSampleBuffer, but it is not displayed on the screen... :( – zaxy78 2017-11-01 14:15:19