I built a video player that analyzes the live audio and video tracks of the video that is currently playing. The videos are stored on the iOS device (in the app's Documents directory).
This all works fine. I use an MTAudioProcessingTap to get at all the audio samples and do some FFT, and I analyze the video by simply copying the pixel buffer for the currently played CMTime (the AVPlayer currentTime property). As I said, this works fine.
But now I want to support AirPlay. AirPlay itself is not difficult, but as soon as AirPlay is toggled and the video is playing on an Apple TV, my taps stop working. Somehow the MTAudioProcessingTap won't process, and the pixel buffers are all empty… I can't get at the data.
Is there any way to get at this data?
In order to get the pixel buffers, I just fire an event every few milliseconds and retrieve the player's currentTime. Then:
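(Background on why this happens: when external playback via AirPlay is active, AVPlayer hands the asset over to the Apple TV and the decoding happens there, so no samples flow through the local tap or video output. A possible, if limiting, workaround is to disable external playback so the device keeps decoding locally and AirPlay falls back to screen mirroring. A minimal sketch using the real `AVPlayer` external-playback properties:)

```objectivec
// Sketch: force local decoding so the tap and video output keep
// receiving data. With external playback disabled, AirPlay can still
// mirror the screen, but AVPlayer no longer streams the asset to the
// Apple TV directly.
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
player.allowsExternalPlayback = NO;
player.usesExternalPlaybackWhileExternalScreenIsActive = NO;
```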
CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8_t *tempAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
where tempAddress points at my pixel buffer, and videoOutput is an instance of AVPlayerItemVideoOutput.
For the audio, I use:
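(The "fire an event every few milliseconds" polling above is commonly driven with a CADisplayLink, which ticks in step with the display refresh. A minimal sketch, assuming hypothetical `self.player` and `self.videoOutput` properties:)

```objectivec
// Sketch: drive the pixel-buffer polling with a CADisplayLink.
// Assumes self.player (AVPlayer) and self.videoOutput
// (AVPlayerItemVideoOutput) properties exist.
- (void)startPolling {
    CADisplayLink *link =
        [CADisplayLink displayLinkWithTarget:self
                                    selector:@selector(displayLinkFired:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link {
    CMTime time = [self.player currentTime];
    // Only copy a buffer when the output actually has a new frame.
    if ([self.videoOutput hasNewPixelBufferForItemTime:time]) {
        CVPixelBufferRef buffer =
            [self.videoOutput copyPixelBufferForItemTime:time
                                      itemTimeForDisplay:NULL];
        if (buffer) {
            // ... lock, read, and analyze the buffer as shown above ...
            CVBufferRelease(buffer); // copyPixelBufferForItemTime returns +1
        }
    }
}
```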
AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;
MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
NSLog(@"Unable to create the Audio Processing Tap");
return;
}
inputParams.audioTapProcessor = tap;
// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
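(For completeness, the `process` function registered above is a plain C callback. A minimal sketch that just pulls the source audio through; the FFT would run over `bufferListInOut` where the comment sits:)

```objectivec
// Sketch of the process callback wired up above. It asks the tap for the
// source samples, which land in bufferListInOut; analysis (e.g. the FFT)
// happens on that buffer list after the call.
static void process(MTAudioProcessingTapRef tap,
                    CMItemCount numberFrames,
                    MTAudioProcessingTapFlags flags,
                    AudioBufferList *bufferListInOut,
                    CMItemCount *numberFramesOut,
                    MTAudioProcessingTapFlags *flagsOut) {
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames,
                                                      bufferListInOut,
                                                      flagsOut,
                                                      NULL, // time range not needed
                                                      numberFramesOut);
    if (err != noErr) {
        NSLog(@"MTAudioProcessingTapGetSourceAudio failed: %d", (int)err);
        return;
    }
    // ... run the FFT over bufferListInOut here ...
}
```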
Regards,
Niek
Solution
Here is the solution:
This is for implementing AirPlay. I only use this code for the audio in my app; I don't know whether you can make it work for video too, but you can try ;)
In AppDelegate.m:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
[RADStyle applyStyle];
[radiosound superclass];
[self downloadZip];
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,sizeof(sessionCategory),&sessionCategory);
UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
AudioSessionSetProperty (kAudioSessionProperty_OverrideAudioRoute,sizeof (audioRouteOverride),&audioRouteOverride);
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
return YES;
}
If you use AirPlay and want to implement the lock-screen controls (artwork, stop/play, title, etc.),
use this code in the DetailViewController of your player:
- (BOOL)canBecomeFirstResponder {
return YES;
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
NSData* imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString: (self.saved)[@"image"]]];
if (imageData == nil){
MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
MPMediaItemArtwork *albumart = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"], MPMediaItemPropertyArtist: saved[@"title"], MPMediaItemPropertyArtwork: albumart};
} else {
MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
MPMediaItemArtwork *albumart = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"], MPMediaItemPropertyArtwork: albumart};
}
}
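(To actually react to the remote/lock-screen buttons enabled by `beginReceivingRemoteControlEvents` and `becomeFirstResponder` above, the view controller can override `remoteControlReceivedWithEvent:`. A minimal sketch; the `playRadio`/`pauseRadio`/`togglePlayPause` method names are hypothetical placeholders for your own playback methods:)

```objectivec
// Sketch: handle the transport controls enabled above. Assumes the view
// controller is first responder; playRadio/pauseRadio/togglePlayPause are
// hypothetical methods standing in for your own playback control code.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            [self playRadio];       // hypothetical
            break;
        case UIEventSubtypeRemoteControlPause:
        case UIEventSubtypeRemoteControlStop:
            [self pauseRadio];      // hypothetical
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            [self togglePlayPause]; // hypothetical
            break;
        default:
            break;
    }
}
```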
Hope this code helps you ;)