I am using WebRTC and I want to write the local video stream to a file. I would be grateful for any hint on how to solve this problem.
Thanks for reading.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    // Connect to the signaling server and set up the PeerConnectionFactory.
    //sigConnect("http://10.54.36.19:8000/");
    sigConnect("http://unwebrtc.herokuapp.com/");
    initWebRTC();

    // Enumerate the camera devices.
    Log.i(TAG, "VideoCapturerAndroid.getDeviceCount() = " + VideoCapturerAndroid.getDeviceCount());
    String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
    String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
    Log.i(TAG, "VideoCapturerAndroid.getNameOfFrontFacingDevice() = " + nameOfFrontFacingDevice);
    Log.i(TAG, "VideoCapturerAndroid.getNameOfBackFacingDevice() = " + nameOfBackFacingDevice);

    // Capture from the front camera and create the local video track.
    VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
    MediaConstraints videoConstraints = new MediaConstraints();
    VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
    localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);

    // Render the remote stream full-screen and the local stream as a small overlay.
    glview = (GLSurfaceView) findViewById(R.id.glview);
    VideoRendererGui.setView(glview, null);
    try {
        rendereRemote = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        rendereLocal = VideoRendererGui.createGui(72, 72, 25, 25, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(rendereLocal);
    } catch (Exception e) {
        e.printStackTrace();
    }

    // Put the local track into a media stream for the peer connection.
    mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
    mediaStream.addTrack(localVideoTrack);
}
Answer 0 (score: 0)
The libjingle library renders video into a GLSurfaceView. You could try using the FFmpeg library to save the video frames that reach that view, though I am not sure about audio.
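Building on that idea, below is a minimal sketch of how the frames could be captured on their way to the view: a second renderer attached to localVideoTrack dumps raw I420 planes to a file, which FFmpeg can then wrap into a playable container. The class name YuvFileDumper and the output path are made up for illustration, and the callback and field names (VideoRenderer.Callbacks, I420Frame.yuvPlanes, yuvStrides) assume the old libjingle Java API used in the question; they may differ in other WebRTC revisions.

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import org.webrtc.VideoRenderer;
import org.webrtc.VideoRenderer.I420Frame;

class YuvFileDumper implements VideoRenderer.Callbacks {
    private final FileOutputStream out;

    YuvFileDumper(String path) throws IOException {
        out = new FileOutputStream(path);
    }

    @Override
    public void setSize(int width, int height) {
        // The frame size is also carried on each I420Frame; nothing to do here.
    }

    @Override
    public void renderFrame(I420Frame frame) {
        try {
            // Write the Y, U and V planes back to back, trimming stride padding,
            // so the file is plain planar YUV 4:2:0.
            for (int i = 0; i < 3; i++) {
                int rows = (i == 0) ? frame.height : frame.height / 2;
                int cols = (i == 0) ? frame.width : frame.width / 2;
                int stride = frame.yuvStrides[i];
                ByteBuffer plane = frame.yuvPlanes[i];
                byte[] row = new byte[cols];
                for (int r = 0; r < rows; r++) {
                    plane.position(r * stride);
                    plane.get(row, 0, cols);
                    out.write(row);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

// Usage (hypothetical): keep the on-screen renderer and add a file dumper as well.
// localVideoTrack.addRenderer(new VideoRenderer(new YuvFileDumper("/sdcard/local.yuv")));

Once the raw file exists, something like ffmpeg -f rawvideo -pix_fmt yuv420p -s 640x480 -r 30 -i local.yuv local.mp4 should turn it into an MP4, with the size and frame rate set to whatever the capturer actually produced.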
Answer 1 (score: 0)
You have to create a video container (such as MP4) yourself and manually encode and write each raw frame into it. Recent WebRTC versions can also record audio from the microphone, in which case you would need to encode and mux the audio samples as well.
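To make that concrete, here is a minimal sketch of the container-plus-encoder side using Android's MediaCodec (H.264) and MediaMuxer (MP4); audio would need a second, parallel AAC encoder and track. The class name Mp4VideoWriter is made up, the input is assumed to be YUV420 planar bytes (for example taken from a renderer callback on the local track), and a real implementation would first have to query which input color format the device's encoder actually accepts.

import java.io.IOException;
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;

class Mp4VideoWriter {
    private final MediaCodec encoder;
    private final MediaMuxer muxer;
    private final MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    private int trackIndex = -1;
    private boolean muxerStarted = false;

    Mp4VideoWriter(String path, int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        // Color format is device dependent; YUV420Planar is only an assumption here.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoder.start();
        muxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Queue one raw YUV420 frame with its presentation timestamp (microseconds).
    void writeFrame(byte[] yuv, long ptsUs) {
        int inIndex = encoder.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer inBuf = encoder.getInputBuffers()[inIndex];
            inBuf.clear();
            inBuf.put(yuv);
            encoder.queueInputBuffer(inIndex, 0, yuv.length, ptsUs, 0);
        }
        drain();
    }

    // Move any encoded H.264 output from the codec into the MP4 muxer.
    private void drain() {
        while (true) {
            int outIndex = encoder.dequeueOutputBuffer(info, 0);
            if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // The muxer track must be created from the encoder's output format.
                trackIndex = muxer.addTrack(encoder.getOutputFormat());
                muxer.start();
                muxerStarted = true;
            } else if (outIndex >= 0) {
                boolean isConfig = (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
                if (muxerStarted && info.size > 0 && !isConfig) {
                    ByteBuffer outBuf = encoder.getOutputBuffers()[outIndex];
                    muxer.writeSampleData(trackIndex, outBuf, info);
                }
                encoder.releaseOutputBuffer(outIndex, false);
            } else {
                break; // Nothing more to drain right now.
            }
        }
    }

    void release() {
        encoder.stop();
        encoder.release();
        if (muxerStarted) {
            muxer.stop();
        }
        muxer.release();
    }
}

Each captured frame would go through writeFrame() with a monotonically increasing timestamp, and release() finalizes the MP4 once capture stops.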