I am trying to do a proof of concept for video streaming, and I am using ASP.NET with C#. I am a bit lost; do you have any ideas or suggestions?
Answer 0 (score: 25)
No. SignalR is built on standard transports (WebSockets, long polling, forever frame, etc.) that carry text-based JSON messages. You would probably be better off looking at the WebRTC specification. That said, you can combine the two technologies: send control messages over SignalR that trigger some JavaScript to change which WebRTC source the browser is currently displaying.
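The control-message idea from this answer can be sketched as follows. Note that the `SwitchSource` method name, the message shape, and the stubbed hub connection are all hypothetical names for illustration; only the `on(method, handler)` registration pattern mirrors the real SignalR JavaScript client.

```typescript
// Minimal shape of the SignalR client API we rely on (stubbed for illustration).
interface HubLike {
  on(method: string, handler: (...args: any[]) => void): void;
}

// Hypothetical control message telling the browser which WebRTC source to show.
interface SwitchSourceMessage {
  streamId: string;
}

// Wires a SignalR-style control channel to a callback that swaps the WebRTC
// source currently displayed (in a browser: video.srcObject = streams[streamId]).
function wireSourceSwitching(
  hub: HubLike,
  applySource: (streamId: string) => void
): void {
  hub.on("SwitchSource", (msg: SwitchSourceMessage) => {
    applySource(msg.streamId);
  });
}

// --- stub demonstration, standing in for a real HubConnection ---
const handlers: Record<string, (...args: any[]) => void> = {};
const fakeHub: HubLike = {
  on: (method, handler) => { handlers[method] = handler; },
};

let current = "";
wireSourceSwitching(fakeHub, (id) => { current = id; });
handlers["SwitchSource"]({ streamId: "camera-2" });
// current now holds "camera-2"
```

With a real connection, the media itself still flows peer-to-peer over WebRTC; SignalR only carries the small JSON control messages.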
Answer 1 (score: 21)
I have implemented video streaming over SignalR. You can find my example at http://weblogs.asp.net/ricardoperes/archive/2014/04/24/video-streaming-with-asp-net-signalr-and-html5.aspx.
Answer 2 (score: 0)
I don't know whether SignalR was intended for video streaming, but SignalR is a hub container for client-to-client, client-to-server, and server-to-client communication. If I want to build video chat, why shouldn't I use it as the hub? In any case, SignalR can also handle byte arrays, not only strings, so you can try it by sending each frame as a byte[]. At least when I was using .NET only, I could work with byte[] directly; once I brought Python into the mix, I needed to serialize to base64 strings, and that also worked through my Pi. Keep an eye on the lab solution I pushed to GitHub: https://github.com/Guille1878/VideoChat
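The base64 workaround mentioned above (needed when a non-.NET client is in the loop) is just a serialization step on the sending side. A minimal sketch, shown here in TypeScript under Node; the helper name is hypothetical:

```typescript
// The answer notes that non-.NET clients may need frames serialized as
// base64 strings instead of raw byte arrays. A hypothetical helper doing
// that before invoking the hub method:
function frameToBase64(frame: Uint8Array): string {
  // Buffer is built into Node; in a browser you would base64-encode differently.
  return Buffer.from(frame).toString("base64");
}

const frame = new Uint8Array([255, 216, 255]); // first bytes of a JPEG, say
const payload = frameToBase64(frame);
// payload could then be sent via connection.invoke("UploadStream", payload)
```

The receiving side simply decodes the base64 string back to bytes before rendering the frame.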
SignalR hub (default, not serverless):
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

namespace ChatHub
{
    public interface IVideoChatClient
    {
        // Called on clients whenever a new encoded frame is available.
        Task DownloadStream(byte[] stream);
    }

    public class VideoChatHub : Hub<IVideoChatClient>
    {
        // Broadcasts each uploaded frame to every connected client.
        public async Task UploadStream(byte[] stream)
        {
            await Clients.All.DownloadStream(stream);
        }
    }
}
Video sender (UWP):
while (isStreamingOut)
{
    // Grab the current preview frame from the capture device.
    var previewProperties = mediaCapture.VideoDeviceController.GetMediaStreamProperties(MediaStreamType.VideoPreview) as VideoEncodingProperties;
    VideoFrame videoFrame = new VideoFrame(BitmapPixelFormat.Bgra8, (int)previewProperties.Width, (int)previewProperties.Height);
    var frame = await mediaCapture.GetPreviewFrameAsync(videoFrame);
    if (frame == null)
    {
        await Task.Delay(delayMilliSeconds);
        continue;
    }

    // Encode the frame as a JPEG into an in-memory stream.
    var memoryRandomAccessStream = new InMemoryRandomAccessStream();
    var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, memoryRandomAccessStream);
    encoder.SetSoftwareBitmap(frame.SoftwareBitmap);
    encoder.IsThumbnailGenerated = false;
    await encoder.FlushAsync();

    try
    {
        // Rewind before reading; after encoding, the stream position is at the end.
        memoryRandomAccessStream.Seek(0);
        var array = new byte[memoryRandomAccessStream.Size];
        await memoryRandomAccessStream.ReadAsync(array.AsBuffer(), (uint)memoryRandomAccessStream.Size, InputStreamOptions.None);
        if (array.Any())
            await connection.InvokeAsync("UploadStream", array);
    }
    catch (Exception ex)
    {
        System.Diagnostics.Debug.WriteLine(ex.Message);
    }

    await Task.Delay(5);
}
Video receiver (UWP):
private async void StreamVideo_Click(object sender, RoutedEventArgs e)
{
    isStreamingIn = StreamVideo.IsChecked ?? false;
    if (isStreamingIn)
    {
        // Queue incoming frames; they are decoded and rendered in BuildImageFrames.
        hubConnection.On<byte[]>("DownloadStream", (stream) =>
        {
            _ = this.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () =>
            {
                if (isStreamingIn)
                    StreamedArraysQueue.Enqueue(stream);
            });
        });

        if (hubConnection.State == HubConnectionState.Disconnected)
            await hubConnection.StartAsync();

        _ = BuildImageFrames();
    }
}
private async Task BuildImageFrames()
{
    while (isStreamingIn)
    {
        await Task.Delay(5);
        StreamedArraysQueue.TryDequeue(out byte[] buffer);
        if (!(buffer?.Any() ?? false))
            continue;

        try
        {
            // Decode the received JPEG bytes back into a bitmap.
            var randomAccessStream = new InMemoryRandomAccessStream();
            await randomAccessStream.WriteAsync(buffer.AsBuffer());
            randomAccessStream.Seek(0);
            await randomAccessStream.FlushAsync();
            var decoder = await BitmapDecoder.CreateAsync(randomAccessStream);
            var softwareBitmap = await decoder.GetSoftwareBitmapAsync();
            // ConvertToSoftwareBitmapSource is a helper from the linked repository.
            var imageSource = await ConvertToSoftwareBitmapSource(softwareBitmap);
            ImageVideo.Source = imageSource;
        }
        catch (Exception ex)
        {
            System.Diagnostics.Debug.WriteLine(ex.Message);
        }
    }
}
I am using SignalR Core.