So far I have built about 50% of a Flash AIR app for iOS. I got all the live streaming working with the Red5 media server, but then I found that the iPhone camera is displayed rotated 90 degrees, so the stream being sent to and recorded on the Red5 server is also rotated 90 degrees. I have read a lot of posts online and on Stack Overflow; this is a known bug, and it is not clear when it will be fixed. I also tried a couple of ANEs (AIR Native Extensions) that are supposed to handle camera rotation inside an AIR app, namely DiaDraw and StarlingCamera, but I could not find any tutorials for them online, so I could not get them working.
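One display-side workaround (a minimal sketch only, my own assumption rather than anything confirmed by Adobe) is to rotate the on-screen Video object after attaching the camera. Note that this only straightens the local preview; the frames published by nsOut.attachCamera(cam) are still rotated, which is why the recording on the Red5 server keeps the wrong orientation:

// Hedged sketch: rotate only the local preview by 90 degrees.
// The published stream is untouched, so the server-side recording
// would still be rotated.
function rotatePreview(vid:Video, cam:Camera):void
{
    vid.width = cam.width;
    vid.height = cam.height;
    vid.attachCamera(cam);
    vid.rotation = 90;   // rotates clockwise around the top-left corner
    vid.x = cam.height;  // shift right so the rotated video stays on screen
    vid.y = 0;
}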
Here is the Adobe AIR for iOS code that connects to the Red5 server and shows the iPhone camera rotated 90 degrees:
import flash.display.DisplayObject;
import flash.display.Sprite;
import flash.display.StageAlign;
import flash.display.StageScaleMode;
import flash.events.ActivityEvent;
import flash.events.MouseEvent;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var nc:NetConnection;
var cam:Camera;
var mic:Microphone;
var vid:Video;
var nsOut:NetStream;
var nsIn:NetStream;

// support autoOrients
stage.align = StageAlign.TOP_LEFT;
stage.scaleMode = StageScaleMode.NO_SCALE;

cam = Camera.getCamera();

if (!cam)
{
    // txt is a TextField assumed to be placed on the stage
    txt.text = "No camera is installed.";
}
else
{
    cam.setMode(320, 300, 25);
    cam.setQuality(0, 100);
    mic = Microphone.getMicrophone();

    nc = new NetConnection();
    nc.client = this;
    nc.addEventListener(NetStatusEvent.NET_STATUS, getStream);
    nc.connect("rtmp://192.168.1.5/RED5Hugt");
    //connectCamera();
}

function getStream(e:NetStatusEvent):void
{
    // Wait for the connection to actually succeed before creating streams.
    if (e.info.code != "NetConnection.Connect.Success")
    {
        return;
    }

    connectCamera();

    nsIn = new NetStream(nc);
    vid.attachNetStream(nsIn);
    nsIn.play("tester");

    nsOut = new NetStream(nc);
    nsOut.attachAudio(mic);
    nsOut.attachCamera(cam);
    nsOut.publish("tester", "live");

    // add click event to record button
    // add event for stage video render state
}

function connectCamera():void
{
    vid = new Video();
    vid.width = cam.width;
    vid.height = cam.height;
    vid.x = 0;
    vid.y = 0;
    vid.attachCamera(cam);
    addChild(vid);
    //stage.addEventListener(MouseEvent.CLICK, clickHandler);
}

// Cycles through camera modes; not wired up while the listener above is commented out.
function clickHandler(e:MouseEvent):void
{
    switch (cam.width)
    {
        case 160:
            cam.setMode(320, 240, 10);
            break;
        case 320:
            cam.setMode(640, 480, 5);
            break;
        default:
            cam.setMode(160, 120, 15);
            break;
    }
    removeChild(vid);
    connectCamera();
}
Red5 server code:
package com;
import java.lang.reflect.Array;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Hashtable;
import org.red5.server.adapter.ApplicationAdapter;
import org.red5.server.adapter.MultiThreadedApplicationAdapter;
import org.red5.server.api.IClient;
import org.red5.server.api.IConnection;
import org.red5.server.api.Red5;
import org.red5.server.api.scope.IScope;
import org.red5.server.api.service.IPendingServiceCall;
import org.red5.server.api.service.IPendingServiceCallback;
import org.red5.server.api.service.IServiceCapableConnection;
import org.red5.server.api.service.IServiceHandlerProvider;
import org.red5.server.api.stream.IBroadcastStream;
import org.red5.server.api.stream.IServerStream;
import org.red5.server.api.stream.support.SimplePlayItem;
import org.red5.server.api.stream.support.StreamUtils;
import org.red5.server.stream.ClientBroadcastStream;
import org.red5.server.stream.RecordingListener;
import static java.lang.System.*;
public class Application extends MultiThreadedApplicationAdapter {

    //private static final Log log = LogFactory.getLog( Application.class );
    private IBroadcastStream serverStream;

    @Override
    public boolean appStart(IScope scope) {
        // Call the adapter's default start-up exactly once.
        return super.appStart(scope);
    }

    @Override
    public void appStop(IScope scope) {
    }

    // User connecting/disconnecting to/from application
    @Override
    public boolean appConnect(IConnection connection, Object[] parameters) {
        super.appConnect(connection, parameters);
        //connection.getClient().setAttribute("userName", parameters[0]);
        return true;
    }

    @Override
    public void appDisconnect(IConnection connection) {
        super.appDisconnect(connection);
    }

    // User joining/leaving scope
    @Override
    public boolean appJoin(IClient client, IScope scope) {
        super.appJoin(client, scope);
        return true;
    }

    @Override
    public void appLeave(IClient client, IScope scope) {
        super.appLeave(client, scope);
    }

    public boolean connect(IConnection conn, IScope scope, Object[] params) {
        super.connect(conn, scope, params);
        return true;
    }
}
So then I decided to go native: I used the Media and CommLib libraries in Objective-C and found this code, which I have pasted on Pastebin because there was too much of it to post inline.
This is the RTMP client that successfully connects to the Red5 Java ApplicationAdapter running on another PC.
When I run the RTMP client from Xcode, the app streams live, but every few seconds the audio glitches. I could not find anything that fixes this; maybe I need to change some of the code. I found this RTMP client code online and only changed the stream name and RTMP URL so they point at my application on the Red5 server.
Why does the audio glitch? So now I want to give the Wowza media server a chance.
So far I have downloaded Wowza Media Server and then installed Eclipse. After installing Eclipse I installed the Wowza IDE into Eclipse (making sure Wowza Media Server was installed before the Wowza IDE).
Then I found some server code online that runs on the Wowza server. The new Wowza server is managed from the browser; I tried to run it from the terminal but could not get it going, because it kept saying the operation could not be run in the terminal.
Here is the code I am using on the server:
package com;
import com.wowza.wms.application.*;
import com.wowza.wms.amf.*;
import com.wowza.wms.client.*;
import com.wowza.wms.module.*;
import com.wowza.wms.request.*;
import com.wowza.wms.stream.*;
import com.wowza.wms.rtp.model.*;
import com.wowza.wms.httpstreamer.model.*;
import com.wowza.wms.httpstreamer.cupertinostreaming.httpstreamer.*;
import com.wowza.wms.httpstreamer.smoothstreaming.httpstreamer.*;
public class RED5Hugt extends ModuleBase {

    public void doSomething(IClient client, RequestFunction function,
            AMFDataList params) {
        getLogger().info("doSomething");
        sendResult(client, params, "Hello Wowza");
    }

    public void onAppStart(IApplicationInstance appInstance) {
        String fullname = appInstance.getApplication().getName() + "/"
                + appInstance.getName();
        getLogger().info("onAppStart: " + fullname);
    }

    public void onAppStop(IApplicationInstance appInstance) {
        String fullname = appInstance.getApplication().getName() + "/"
                + appInstance.getName();
        getLogger().info("onAppStop: " + fullname);
    }

    public void onConnect(IClient client, RequestFunction function,
            AMFDataList params) {
        getLogger().info("onConnect: " + client.getClientId());
    }

    public void onConnectAccept(IClient client) {
        getLogger().info("onConnectAccept: " + client.getClientId());
    }

    public void onConnectReject(IClient client) {
        getLogger().info("onConnectReject: " + client.getClientId());
    }

    public void onDisconnect(IClient client) {
        getLogger().info("onDisconnect: " + client.getClientId());
    }

    public void onStreamCreate(IMediaStream stream) {
        getLogger().info("onStreamCreate: " + stream.getSrc());
    }

    public void onStreamDestroy(IMediaStream stream) {
        getLogger().info("onStreamDestroy: " + stream.getSrc());
    }
}
So now I run the Xcode project (the RTMP client) and I can see that the RTMP client does connect, but I do not get a live stream. Where do I need to change the code to get live streaming to work?
Also, given my experience with Adobe AIR for iOS and with Red5, is this last route the one I should take? I am not sure. I have researched a lot, and I really want to find a no-cost way to stream from the iPhone camera. I tried Adobe AIR for iOS to a Red5 server and ran into the known rotation bug; I tried a native client to Red5 and got audio glitches; so is Wowza the best option?
I get output in the Xcode console saying the stream cannot be found. Here is the output:
2014-07-30 13:33:06.246 RTMPStreamComeback[340:4003] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = MediaStreamPlayer, 4 = NetStream.Play.StreamNotFound
I have set the upstream name to "tester" and the downstream name to "tester".
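One way to sanity-check whether a stream called "tester" is actually being published under my application (a minimal sketch, assuming the same host 192.168.1.4 and application name RED5Hugt from my logs) is to subscribe from a plain Flash/AIR client and trace the NetStatus codes:

import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var checkNc:NetConnection = new NetConnection();
checkNc.addEventListener(NetStatusEvent.NET_STATUS, onCheckStatus);
checkNc.connect("rtmp://192.168.1.4:1935/RED5Hugt");

function onCheckStatus(e:NetStatusEvent):void
{
    trace(e.info.code);
    if (e.info.code == "NetConnection.Connect.Success")
    {
        var checkNs:NetStream = new NetStream(checkNc);
        checkNs.addEventListener(NetStatusEvent.NET_STATUS, onCheckStatus);
        var checkVid:Video = new Video();
        checkVid.attachNetStream(checkNs);
        addChild(checkVid);
        // NetStream.Play.StreamNotFound here means nothing is being
        // published under this name while the subscriber is connected.
        checkNs.play("tester");
    }
}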
Streaming works now. I found this tutorial online and discovered that I needed to include an Application.xml file and configure it. Here is the tutorial:
https://www.youtube.com/watch?v=XoojcVfdHWg
But once again, just like with Red5, the audio seems to glitch, although the stream itself does not drop because of the audio.
How do I fix this?
Here is the trace from Xcode:
2014-07-30 15:19:48.296 RTMPStreamComeback[397:60b] connectControl: host = rtmp://192.168.1.4:1935/RED5Hugt
2014-07-30 15:19:49.325 RTMPStreamComeback[397:60b] Video encoding is initialized: bit_rate = 272000, rc_max_rate = 0, rc_min_rate = 0, qmin=2, qmax=31, qcompress=0.500000
2014-07-30 15:19:49.334 RTMPStreamComeback[397:60b] AudioCodec: codecID = 86050, codecType = 42, bitRate = 16000, _sampleBytes = 4
encoder supports the sample formats:
flt,
audio codec best options: sample_rate = 44100
2014-07-30 15:19:49.340 RTMPStreamComeback[397:60b] audio codec context: codec_type = 1, sample_fmt = flt, bit_rate = 16000, sample_rate = 16000, channels = 1, frame_bits = 4, channel_layout = 0, frame_size = 0, buffer_size = 256
2014-07-30 15:19:49.344 RTMPStreamComeback[397:60b] initVideoCapture -> preset AVCaptureSessionPresetLow is supported, orientation = 3
2014-07-30 15:19:49.414 RTMPStreamComeback[397:60b] BroadcastStreamClient STREAM ----> name: tester, type: 2, [socket retainCount] = 3
2014-07-30 15:19:49.789 RTMPStreamComeback[397:60b] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = BroadcastStreamClient, 1 = RTMP.Client.isConnected
2014-07-30 15:19:49.831 RTMPStreamComeback[397:3f03] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = BroadcastStreamClient, 2 = RTMP.Client.Stream.isCreated
2014-07-30 15:19:50.585 RTMPStreamComeback[397:3f03] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = BroadcastStreamClient, 3 = NetStream.Publish.Start
[flv @ 0x1883d000] Error, Invalid timestamp=0, last=0
[flv @ 0x1883d000] Error, Invalid timestamp=0, last=0
[flv @ 0x1883d000] Error, Invalid timestamp=0, last=0
2014-07-30 15:20:01.677 RTMPStreamComeback[397:60b] publishControl: stream = slavav
2014-07-30 15:20:01.680 RTMPStreamComeback[397:60b] NellyMoserDecoder -> audio codec context: codec_type = 1, sample_fmt = flt, bit_rate = 16000, sample_rate = 16000, channels = 1, frame_bits = 4, channel_layout = 0, frame_size = 0
2014-07-30 15:20:01.681 RTMPStreamComeback[397:60b] Set Player's Framework -> 'AudioUnit'
2014-07-30 15:20:01.685 RTMPStreamComeback[397:60b] VideoStream decoding is initialized: context->pix_fmt = 0, width = 0, height = 0
2014-07-30 15:20:01.689 RTMPStreamComeback[397:60b] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = MediaStreamPlayer, 1 = RTMP.Client.isConnected
2014-07-30 15:20:01.712 RTMPStreamComeback[397:3f03] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = MediaStreamPlayer, 2 = RTMP.Client.Stream.isCreated
2014-07-30 15:20:01.758 RTMPStreamComeback[397:3f03] $$$$$$ <MPIMediaStreamEvent> stateChangedEvent: sender = MediaStreamPlayer, 3 = NetStream.Play.Start
[flv @ 0x1808c600] Bad picture start code
[flv @ 0x1808c600] header damaged
2014-07-30 15:20:01.764 RTMPStreamComeback[397:3f03] VideoStream -> decodeFrame: (ERROR) got_packet = 0, processed_size = -1
[swscaler @ 0x2e38000] No accelerated colorspace conversion found from yuv420p to bgra.
2014-07-30 15:20:01.775 RTMPStreamComeback[397:7a03] MPAudioUnitEngine -> nextFrame: < NO PCM - WHITE NOISE > timestamp = 11192, dropWhiteNoise = 0
2014-07-30 15:20:01.799 RTMPStreamComeback[397:7a03] MPAudioUnitEngine -> nextFrame: < NO PCM - WHITE NOISE > timestamp = 11215, dropWhiteNoise = 0
2014-07-30 15:20:01.821 RTMPStreamComeback[397:7a03] MPAudioUnitEngine -> nextFrame: < NO PCM - WHITE NOISE > timestamp = 11238, dropWhiteNoise = 0
2014-07-30 15:20:01.845 RTMPStreamComeback[397:7a03] MPAudioUnitEngine -> nextFrame: < NO PCM - WHITE NOISE > timestamp = 11261, dropWhiteNoise = 0
2014-07-30 15:20:01.867 RTMPStreamComeback[397:7a03] MPAudioUnitEngine -> nextFrame: < NO PCM - WHITE NOISE > timestamp = 11284, dropWhiteNoise = 0
[flv @ 0x1808c600] Bad picture start code
[flv @ 0x1808c600] header damaged
2014-07-30 15:20:01.888 RTMPStreamComeback[397:3f03] VideoStream -> decodeFrame: (ERROR) got_packet = 0, processed_size = -1
2014-07-30 15:23:31.088 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 320, dropWhiteNoise = 8600, pcm.remaining = 16036
2014-07-30 15:23:31.090 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 1024, dropWhiteNoise = 7576, pcm.remaining = 16036
2014-07-30 15:23:31.093 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 2048, dropWhiteNoise = 5528, pcm.remaining = 16036
2014-07-30 15:23:31.164 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 640, dropWhiteNoise = 4888, pcm.remaining = 16056
2014-07-30 15:23:31.219 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 128, dropWhiteNoise = 4760, pcm.remaining = 16028
2014-07-30 15:23:31.298 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 192, dropWhiteNoise = 4568, pcm.remaining = 16036
2014-07-30 15:23:31.604 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 128, dropWhiteNoise = 4440, pcm.remaining = 16048
2014-07-30 15:23:31.740 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 320, dropWhiteNoise = 4120, pcm.remaining = 16024
2014-07-30 15:23:32.296 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 192, dropWhiteNoise = 3928, pcm.remaining = 16008
2014-07-30 15:24:47.575 RTMPStreamComeback[397:3f03] MPAudioUnitEngine -> dropPcm: *** DROPPED = 64, dropWhiteNoise = 3864, pcm.remaining = 16000
These are the library's default audio and video streaming settings:
Audio:
codec - Nellymoser 16 kHz, mono
bitrate - 128000
Video:
codec - H.263 (Sorenson)
bitrate - 200000
resolution - 192x144px
fps = 25
intra frame - 10
The library documentation also says:
Currently the library does not provide an instrument to control the stream quality. This is planned for a future release.
So I don't think the audio stream format can be changed.