I am using OpenCV 2.4.8 on Android through JNI.
I open the camera with VideoCapture and I want to record it. I have the image in a cv::Mat and it is displayed correctly on screen.
However, when I try to open the VideoWriter, it always returns false.
// Camera resolution is 640x480 and it is fine.
_camera_resolution = calc_optimal_camera_resolution(u.name, 640, 480);

// Store on the sdcard; I have the permission in AndroidManifest.xml
const char * videoName = "/sdcard/videoTest.avi";

// Segmentation fault 11! Does this method not work on Android? So it is commented out.
//const int ex = static_cast<int>(_reader.get(CV_CAP_PROP_FOURCC));

// Segmentation fault 11 here too! Also commented out.
//const double fps = _reader.get(CV_CAP_PROP_FPS);

// Try to open
_isRecording = _writer.open(videoName, -1, 30, _camera_resolution, true);

// Always returns FALSE
if (!_writer.isOpened())
{
    LOGE("rec - Error opening video writer");
}
else
{
    LOGD("rec - Video writer opened in startRecording");
}
I have tried as FOURCC:
CV_FOURCC('M','J','P','G') and CV_FOURCC('M','P','4','V') // neither works!
I have tried different fps rates: 15.0, 30.0, ...
The camera resolution seems to be valid, because the value is correct when I print it.
Why does the writer not open correctly?
Answer 0 (score: 2)
As far as I know, OpenCV4Android does not support video reading or writing. Try rebuilding your OpenCV with the encoder options enabled (for example, WITH_FFMPEG=YES or WITH_VFW=YES).
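Before (or after) rebuilding, it helps to confirm which video I/O backends the OpenCV build actually contains. Below is a minimal sketch of such a check, assuming the OpenCV Java bindings are already initialized (e.g. via OpenCVLoader); Core.getBuildInformation() mirrors the C++ cv::getBuildInformation(), while the class and log-tag names are just illustrative:

import android.util.Log;
import org.opencv.core.Core;

public class BuildInfoCheck {
    // Dump the OpenCV build configuration; the "Video I/O" section of the
    // output shows whether FFMPEG / VFW (or any other writer backend) was
    // compiled in. If none was, VideoWriter.open() keeps returning false
    // no matter which FOURCC is used.
    public static void logVideoIoSupport() {
        Log.d("OpenCVBuild", Core.getBuildInformation());
    }
}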
Alternatively, save a sequence of images and then encode a video from that sequence in Java code. I tried the following proposition (ref):
public void imageToMP4(BufferedImage bi) {
    // A transform to convert RGB to YUV colorspace
    RgbToYuv420 transform = new RgbToYuv420(0, 0);

    // A JCodec native picture that will hold the source image in YUV colorspace
    Picture toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);

    // Perform the conversion
    transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

    // Create MP4 muxer ('sink' is the channel the output file is written to)
    MP4Muxer muxer = new MP4Muxer(sink, Brand.MP4);

    // Add a video track
    CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

    // Create H.264 encoder
    H264Encoder encoder = new H264Encoder();

    // Allocate a buffer that will hold an encoded frame
    ByteBuffer _out = ByteBuffer.allocate(bi.getWidth() * bi.getHeight() * 6);

    // Allocate storage for SPS/PPS; they need to be stored separately in a special place of the MP4 file
    List<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
    List<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();

    // Encode the image into an H.264 frame; the result is stored in the '_out' buffer
    ByteBuffer result = encoder.encodeFrame(_out, toEncode);

    // Based on the frame above, form a correct MP4 packet
    H264Utils.encodeMOVPacket(result, spsList, ppsList);

    // Add the packet to the video track
    outTrack.addFrame(new MP4Packet(result, 0, 25, 1, 0, true, null, 0, 0));

    // Push the saved SPS/PPS to a special storage in the MP4
    outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

    // Write the MP4 header and finalize the recording
    muxer.writeHeader();
}
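Note that 'sink' in the sample above is not declared anywhere; it is the writable channel the MP4 muxer writes into. Below is a minimal sketch of how it could be created and released, assuming the JCodec 0.1.x API (exact package and class names may differ between JCodec versions, and the output path is only an example):

import java.io.File;
import java.io.IOException;
import org.jcodec.common.NIOUtils;
import org.jcodec.common.SeekableByteChannel;

public class SinkSetup {
    // Open a writable channel once before encoding; imageToMP4(...) muxes into it.
    public static SeekableByteChannel openSink() throws IOException {
        return NIOUtils.writableFileChannel(new File("/sdcard/videoTest.mp4"));
    }

    // Close the channel only after muxer.writeHeader() has finalized the file.
    public static void closeSink(SeekableByteChannel sink) throws IOException {
        sink.close();
    }
}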
You can download the JCodec library from the project website or through Maven; for the latter, add the following snippet to your pom.xml:
<dependency>
    <groupId>org.jcodec</groupId>
    <artifactId>jcodec</artifactId>
    <version>0.1.3</version>
</dependency>
Android: Android users can use something like the following to convert an Android Bitmap object into JCodec's native format:
public static Picture fromBitmap(Bitmap src) {
    Picture dst = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB);
    fromBitmap(src, dst);
    return dst;
}

public static void fromBitmap(Bitmap src, Picture dst) {
    int[] dstData = dst.getPlaneData(0);
    int[] packed = new int[src.getWidth() * src.getHeight()];

    src.getPixels(packed, 0, src.getWidth(), 0, 0, src.getWidth(), src.getHeight());

    for (int i = 0, srcOff = 0, dstOff = 0; i < src.getHeight(); i++) {
        for (int j = 0; j < src.getWidth(); j++, srcOff++, dstOff += 3) {
            int rgb = packed[srcOff];
            dstData[dstOff]     = (rgb >> 16) & 0xff;  // R
            dstData[dstOff + 1] = (rgb >> 8) & 0xff;   // G
            dstData[dstOff + 2] = rgb & 0xff;          // B
        }
    }
}
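Since BufferedImage and AWTUtil are not available on Android, the RGB Picture returned by fromBitmap(...) still has to be converted to YUV420 before it is handed to the H.264 encoder. Here is a minimal sketch of that step under the same JCodec 0.1.x assumptions (toYuv420 is just an illustrative helper name, assumed to live in the same class as fromBitmap):

import android.graphics.Bitmap;
import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;
import org.jcodec.scale.RgbToYuv420;

// Convert an Android Bitmap into the YUV420 Picture expected by H264Encoder,
// replacing the AWTUtil.fromBufferedImage(...) call used on the desktop.
public static Picture toYuv420(Bitmap bitmap) {
    Picture rgb = fromBitmap(bitmap); // helper shown above, assumed in the same class
    Picture yuv = Picture.create(bitmap.getWidth(), bitmap.getHeight(), ColorSpace.YUV420);
    new RgbToYuv420(0, 0).transform(rgb, yuv);
    return yuv;
}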
Answer 1 (score: 1)
('M','J','P','G') is the only FOURCC Android supports when using the .avi extension. Most importantly, #include <stdio.h>; without it you cannot open a video with VideoWriter.
Answer 2 (score: 0)
cv::VideoWriter writer;
writer.open("your_mp4_file_path", cv::VideoWriter::fourcc('H', '2', '6', '4'),
            15,                  // framerate
            cv::Size(720, 1280),
            true);
writer << mat_frame;
// remember to call writer.release() when finished
Android OpenCV 4.5.2 (OpenCV built with ffmpeg + openh264) works well for me.