I have managed to retrieve a video stream in my React Native client using mediaDevices, and the React Native client is connecting to my NodeJS server. Once connected, the server responds to the client to confirm that it is connected to the NodeJS server. After that, I want to send the video stream I got from mediaDevices to the NodeJS server, and then, in return, have the NodeJS server send that video stream back to the React Native client.
You might ask why I don't just render the video stream directly on the client instead of sending it to the server and back again.
The reason is that I simply want to learn how to use socket.io and socket.io-client to send and receive a video stream.
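To make sure I understand the basic pattern, this is roughly the echo round trip I am trying to reproduce, based on my reading of the socket.io docs (a minimal sketch, assuming a plain ArrayBuffer payload rather than a live MediaStream; the 'chunk' and 'echo' event names are just placeholders I made up):

// Server-side sketch: echo back whatever binary payload the client sends
const io = require('socket.io')(3000);

io.on('connection', socket => {
  socket.on('chunk', payload => {
    // socket.io can carry Buffer / TypedArray / ArrayBuffer payloads as binary
    socket.emit('echo', payload);
  });
});

// Client-side sketch (socket.io-client), using the same connect style as my App.js
const ioClient = require('socket.io-client');
const client = ioClient.connect('http://localhost:3000');

client.on('connect', () => {
  client.emit('chunk', new Uint8Array([1, 2, 3]).buffer); // send an ArrayBuffer
});
client.on('echo', payload => {
  console.log('echoed bytes:', new Uint8Array(payload));
});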
Here is some of the code and how I am trying to implement this at the moment (Server.js):
io.on("connection", socket => {
if(!patients[socket]){
patients[socket.id] = socket;
patients[socket.id].emit('PatientConnected', 'Success');
console.log(`a user connected :D ${socket.id}`);
}else{
}
patients[socket.id].on('stream', function(data) {
console.log("stream from client " + data.stream);
var bufArr = new ArrayBuffer(data.stream);
var bufView = new Uint8Array(bufArr);
patients[socket.id].emit("ReturnStream", bufView);
});
socket.on('error', function() {
delete patients[socket.id];
});
socket.on('disconnect', function() {
delete patients[socket.id];
});
});
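For comparison, from what I have read about socket.io's binary support, if the client were emitting raw byte chunks instead of the MediaStream object, I think the relay part could be as simple as the following (just a sketch, assuming data.stream were already a Buffer or ArrayBuffer):

// Sketch: relay a raw binary chunk without converting it
patients[socket.id].on('stream', function(data) {
  // socket.io forwards Buffer / ArrayBuffer payloads as binary, so no ArrayBuffer(...) wrapping should be needed
  patients[socket.id].emit('ReturnStream', data.stream);
});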
And my client (App.js):
// Connect to the NodeJS server
const socket = io.connect("http://192.168.1.107:3000", { reconnect: true });
console.log("socket: " + socket);

// Server confirms the connection
socket.on("PatientConnected", (msg) => {
  if (msg == 'Success') {
    console.log("socket connected: " + socket.id);
  }
});

// Receive the stream echoed back by the server
socket.on("ReturnStream", (stream) => {
  console.log("ReturnStream: " + stream);
  var bufView = new Uint8Array(stream);
  this.setState({ stream: bufView });
});

const isFront = true;
const facing = isFront ? 'front' : 'environment';

mediaDevices.enumerateDevices().then(sourceInfos => {
  // Pick the camera that matches the requested facing direction
  let videoSourceId;
  for (let i = 0; i < sourceInfos.length; i++) {
    const sourceInfo = sourceInfos[i];
    if (sourceInfo.kind == "videoinput" && sourceInfo.facing == (isFront ? "front" : "back")) {
      videoSourceId = sourceInfo.deviceId;
    }
  }

  mediaDevices.getUserMedia({
    audio: true,
    video: {
      mandatory: {
        minWidth: 500,
        minHeight: 300,
        minFrameRate: 30
      },
      facingMode: (isFront ? "user" : "environment"),
      optional: (videoSourceId ? [{ sourceId: videoSourceId }] : [])
    }
  })
    .then(stream => {
      // Send the captured stream to the server
      if (socket) {
        console.log('stream' + stream);
        socket.emit('stream', { stream: stream });
      }
    })
    .catch(error => {
      console.log(error); // log any getUserMedia error
    });
});
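To rule out the transport itself, one thing I was planning to try is emitting a small typed array instead of the MediaStream as a sanity check (just a sketch; it obviously does not carry any real video data):

// Sketch: sanity-check the round trip with plain binary data instead of the MediaStream
const testBytes = new Uint8Array([0, 1, 2, 3, 4]);
socket.emit('stream', { stream: testBytes.buffer });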
I am getting undefined for bufView from "var bufView = new Uint8Array(stream);". Where am I going wrong, or what am I missing? I'm new to socket.io and socket.io-client. Are these the right libraries for sending and receiving a video stream, and if so, how should that be done?