I am trying to use MediaCodec to retrieve all the frames of a video for image processing. I am trying to render the video and capture the frames from the output buffers,
but I cannot construct a Bitmap instance from the bytes I receive.
I have tried rendering to a surface and to nothing (null), because I noticed that when rendering to null the output buffers do receive the bytes of the rendered frame.
Here is the code:
private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/test_videos/sample2.mp4";
private PlayerThread mPlayer = null;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
SurfaceView sv = new SurfaceView(this);
sv.getHolder().addCallback(this);
setContentView(sv);
}
protected void onDestroy() {
super.onDestroy();
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
}
@Override
public void surfaceChanged(SurfaceHolder holder,int format,int width,int height) {
if (mPlayer == null) {
mPlayer = new PlayerThread(holder.getSurface());
mPlayer.start();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
if (mPlayer != null) {
mPlayer.interrupt();
}
}
private void writeFrametoSDCard(byte[] bytes,int i,int sampleSize) {
try {
Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, sampleSize);
File file = new File(Environment.getExternalStorageDirectory() + "/test_videos/sample" + i + ".png");
if (file.exists())
file.delete();
file.createNewFile();
FileOutputStream out = new FileOutputStream(file.getAbsoluteFile());
bmp.compress(Bitmap.CompressFormat.PNG,90,out);
out.close();
} catch (Exception e) {
e.printStackTrace();
}
}
private class PlayerThread extends Thread {
private MediaExtractor extractor;
private MediaCodec decoder;
private Surface surface;
public PlayerThread(Surface surface) {
this.surface = surface;
}
@Override
public void run() {
extractor = new MediaExtractor();
try {
extractor.setDataSource(SAMPLE);
} catch (IOException e) {
e.printStackTrace();
return;
}
int index = extractor.getTrackCount();
Log.d("MediaCodecTag","Track count: " + index);
for (int i = 0; i < extractor.getTrackCount(); i++) {
MediaFormat format = extractor.getTrackFormat(i);
String mime = format.getString(MediaFormat.KEY_MIME);
if (mime.startsWith("video/")) {
extractor.selectTrack(i);
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format,surface,null,0);
break;
}
}
if (decoder == null) {
Log.e("DecodeActivity","Can't find video info!");
return;
}
decoder.start();
ByteBuffer[] inputBuffers = decoder.getInputBuffers();
ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
BufferInfo info = new BufferInfo();
boolean isEOS = false;
long startMs = System.currentTimeMillis();
int i = 0;
while (!Thread.interrupted()) {
if (!isEOS) {
int inIndex = decoder.dequeueInputBuffer(10000);
if (inIndex >= 0) {
ByteBuffer buffer = inputBuffers[inIndex];
int sampleSize = extractor.readSampleData(buffer, 0);
if (sampleSize < 0) {
decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
isEOS = true;
} else {
decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
extractor.advance();
}
}
}
/* saves frame to sdcard */
int outIndex = decoder.dequeueOutputBuffer(info, 10000); // most of the time returns INFO_TRY_AGAIN_LATER
switch (outIndex) {
case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
Log.d("DecodeActivity","INFO_OUTPUT_BUFFERS_CHANGED");
outputBuffers = decoder.getOutputBuffers();
break;
case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
Log.d("DecodeActivity","New format " + decoder.getOutputFormat());
break;
case MediaCodec.INFO_TRY_AGAIN_LATER:
Log.d("DecodeActivity","dequeueOutputBuffer timed out!");
break;
default:
ByteBuffer buffer = outputBuffers[outIndex];
Log.v("DecodeActivity","We can't use this buffer but render it due to the API limit," + buffer);
// We use a very simple clock to keep the video FPS,or the video
// playback will be too fast
while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
try {
sleep(10);
} catch (InterruptedException e) {
e.printStackTrace();
break;
}
}
try {
// Copy the bytes out BEFORE handing the buffer back to the codec.
ByteBuffer outBuffer = outputBuffers[outIndex];
byte[] dst = new byte[info.size];
outBuffer.position(info.offset);
outBuffer.limit(info.offset + info.size);
outBuffer.get(dst);
writeFrametoSDCard(dst, i, dst.length);
i++;
} catch (Exception e) {
Log.d("DecodeActivity","Error while creating bitmap with: " + e.getMessage());
}
decoder.releaseOutputBuffer(outIndex, true);
break;
}
// All decoded frames have been rendered; we can stop playing now
if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
Log.d("DecodeActivity","OutputBuffer BUFFER_FLAG_END_OF_STREAM");
break;
}
}
decoder.stop();
decoder.release();
extractor.release();
}
}
Any help would be appreciated.
Solution
You can decode to a Surface or to a ByteBuffer, but not both at the same time. Because you are configuring a Surface, there will always be zero bytes of data in the output buffers.
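For reference, the only change needed to route the decoded bytes into the output buffers instead of the Surface is in configure(). This is a minimal fragment, not a fix for the Bitmap problem, since the bytes arrive in a codec-specific YUV layout rather than ARGB:

```java
// Pass null instead of the Surface so the decoded bytes land in the
// output ByteBuffers (in a codec-specific YUV layout, not ARGB):
decoder = MediaCodec.createDecoderByType(mime);
decoder.configure(format, null /* no output surface */, null, 0);
decoder.start();
// After INFO_OUTPUT_FORMAT_CHANGED, ask the codec which layout it produces:
int colorFormat = decoder.getOutputFormat().getInteger(MediaFormat.KEY_COLOR_FORMAT);
```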
If you configure for ByteBuffer decoding, the data format will vary, but as far as I know it will never be an ARGB format that Bitmap understands. You can see examples of two YUV formats being examined in the checkFrame() method of the buffer-to-buffer test in the CTS EncodeDecodeTest. Note, however, that the first thing it does is check the format and return immediately if it isn't recognized.
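To illustrate the kind of per-pixel work needed before raw decoder output becomes something a Bitmap can use, here is a hypothetical conversion routine for one common layout, planar I420 (YUV420), using BT.601 limited-range coefficients. This is a sketch only: real devices may emit other layouts (NV12, vendor-tiled formats, etc.), so the reported color format must be checked first.

```java
// Hypothetical helper: convert a planar I420 frame (Y plane, then U, then V,
// chroma subsampled 2x2) into an int[] of packed ARGB pixels that could be
// passed to Bitmap.createBitmap(pixels, width, height, ARGB_8888).
// Assumes even width/height; BT.601 limited-range coefficients.
public class I420ToArgb {
    public static int[] convert(byte[] i420, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        int uOffset = frameSize;                 // U plane follows Y plane
        int vOffset = frameSize + frameSize / 4; // V plane follows U plane
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int yy = (i420[y * width + x] & 0xFF) - 16;
                int chromaIndex = (y / 2) * (width / 2) + (x / 2);
                int u = (i420[uOffset + chromaIndex] & 0xFF) - 128;
                int v = (i420[vOffset + chromaIndex] & 0xFF) - 128;
                int r = clamp((298 * yy + 409 * v + 128) >> 8);
                int g = clamp((298 * yy - 100 * u - 208 * v + 128) >> 8);
                int b = clamp((298 * yy + 516 * u + 128) >> 8);
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }
    private static int clamp(int c) {
        return c < 0 ? 0 : (c > 255 ? 255 : c);
    }
}
```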
At present (Android 4.4), the only reliable way to do this is to decode to a SurfaceTexture, render that with GLES, and then extract the RGB data with glReadPixels(). Sample code is available on bigflake -- see ExtractMpegFramesTest (requires API 16).
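As a rough sketch, the final extraction step in that approach looks like the fragment below. It assumes the EGL/GLES setup from ExtractMpegFramesTest is already in place and the frame has been rendered to an offscreen surface; width and height are the video dimensions:

```java
// Read back the rendered frame as RGBA bytes and wrap them in a Bitmap.
// (ARGB_8888 bitmaps store pixels as RGBA bytes in memory, so the
// glReadPixels output can be copied in directly.)
ByteBuffer pixelBuf = ByteBuffer.allocateDirect(width * height * 4);
pixelBuf.order(ByteOrder.LITTLE_ENDIAN);
GLES20.glReadPixels(0, 0, width, height,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixelBuf);
pixelBuf.rewind();
Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bmp.copyPixelsFromBuffer(pixelBuf);
```

Note that GLES renders with the origin at the bottom-left, so the resulting image may need to be flipped vertically.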