I am building a voice chat application on Android using WebRTC. I successfully established a connection between my PC and my emulator and streamed my voice in both directions. My phone was running Android 5.1, but it broke, and the one I am using now runs 4.4.2. When I try the app on it, it connects fine but my voice does not stream. This is what I get in the logs:
D/OFFER: v=0
o=- 757416304722047422 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS LOCAL_MEDIA_STREAM_ID
m=audio 9 RTP/SAVPF 111 103 9 102 0 8 106 105 13 127 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:FkcTrOiVjoakQJoa
a=ice-pwd:16SQFwIpnPEdmqyYC2PdSDzI
a=fingerprint:sha-1 1F:85:D7:8C:DB:98:72:E7:D2:DE:52:A7:A4:B5:48:85:F1:BC:F3:AC
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10; useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:9 G722/8000
a=rtpmap:102 ILBC/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:127 red/8000
a=rtpmap:126 telephone-event/8000
a=maxptime:60
a=ssrc:954003986 cname:QR3nFlCQG7p7qQNo
a=ssrc:954003986 msid:LOCAL_MEDIA_STREAM_ID AUDIO_TRACK_ID_LOCAL
a=ssrc:954003986 mslabel:LOCAL_MEDIA_STREAM_ID
a=ssrc:954003986 label:AUDIO_TRACK_ID_LOCAL
D/AudioManager: SetCommunicationMode(1)@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioManager: setCommunicationMode(true)@[name=Thread-303, id=303]
D/WebRtcAudioManager: changing audio mode to: MODE_IN_COMMUNICATION
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioTrackJni: InitPlayout@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioTrack: InitPlayout(sampleRate=44100, channels=1)
D/WebRtcAudioTrack: byteBuffer.capacity: 882
D/AudioTrackJni: OnCacheDirectBufferAddress
D/AudioTrackJni: direct buffer capacity: 882
D/AudioTrackJni: frames_per_buffer: 441
D/WebRtcAudioTrack: AudioTrack.getMinBufferSize: 4096
D/AudioTrackJni: delay_in_milliseconds: 46
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioTrackJni: StartPlayout@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioTrack: StartPlayout
I/dalvikvm: Could not find method android.media.AudioTrack.write, referenced from method org.webrtc.voiceengine.WebRtcAudioTrack$AudioTrackThread.run
W/dalvikvm: VFY: unable to resolve virtual method 1101: Landroid/media/AudioTrack;.write (Ljava/nio/ByteBuffer;II)I
D/dalvikvm: VFY: replacing opcode 0x6e at 0x0078
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/WebRtcAudioTrack: AudioTrackThread@[name=AudioTrackJavaThread, id=307]
D/ICE: IceCandidate added :candidate:547260449 1 udp 2122260223 10.0.2.15 36170 typ host generation 0
D/ICE: IceCandidate added :candidate:1847424209 1 tcp 1518280447 10.0.2.15 60568 typ host tcptype passive generation 0
D/AudioTrackJni: StopPlayout@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioTrack: StopPlayout
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioManager: SetCommunicationMode(0)@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioManager: setCommunicationMode(false)@[name=Thread-319, id=319]
D/WebRtcAudioManager: restoring audio mode to: MODE_NORMAL
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioManager: SetCommunicationMode(1)@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioManager: setCommunicationMode(true)@[name=Thread-320, id=320]
D/WebRtcAudioManager: changing audio mode to: MODE_IN_COMMUNICATION
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioTrackJni: InitPlayout@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioTrack: InitPlayout(sampleRate=44100, channels=1)
D/WebRtcAudioTrack: byteBuffer.capacity: 882
D/AudioTrackJni: OnCacheDirectBufferAddress
D/AudioTrackJni: direct buffer capacity: 882
D/AudioTrackJni: frames_per_buffer: 441
D/WebRtcAudioTrack: AudioTrack.getMinBufferSize: 4096
D/AudioTrackJni: delay_in_milliseconds: 46
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/AudioTrackJni: StartPlayout@[tid=19672]
D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
D/WebRtcAudioTrack: StartPlayout
D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
D/WebRtcAudioTrack: AudioTrackThread@[name=AudioTrackJavaThread, id=324]
D/OFFER: v=0
o=- 2122221720328118009 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS LOCAL_MEDIA_STREAM_ID
m=audio 9 RTP/SAVPF 111 103 9 102 0 8 106 105 13 127 126
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:FoasgPNFAm6dZWo8
a=ice-pwd:lGnZzKSNLhH0vjt0sPw+NIaQ
a=fingerprint:sha-1 45:15:D5:D0:6B:87:81:5D:61:A4:F8:AC:56:EB:E4:2F:1A:59:AA:16
a=setup:actpass
a=mid:audio
a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=sendrecv
a=rtcp-mux
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10; useinbandfec=1
a=rtpmap:103 ISAC/16000
a=rtpmap:9 G722/8000
a=rtpmap:102 ILBC/8000
a=rtpmap:0 PCMU/8000
a=rtpmap:8 PCMA/8000
a=rtpmap:106 CN/32000
a=rtpmap:105 CN/16000
a=rtpmap:13 CN/8000
a=rtpmap:127 red/8000
a=rtpmap:126 telephone-event/8000
a=maxptime:60
a=ssrc:3345216954 cname:chD7I2bAd/Iwdbk1
a=ssrc:3345216954 msid:LOCAL_MEDIA_STREAM_ID AUDIO_TRACK_ID_LOCAL
a=ssrc:3345216954 mslabel:LOCAL_MEDIA_STREAM_ID
a=ssrc:3345216954 label:AUDIO_TRACK_ID_LOCAL
D/ICE: IceCandidate added :candidate:547260449 1 udp 2122260223 10.0.2.15 41078 typ host generation 1
D/ICE: IceCandidate added :candidate:1847424209 1 tcp 1518280447 10.0.2.15 33099 typ host tcptype passive generation 1
I read on some forums that this happens because android.media.AudioTrack.write(ByteBuffer, int, int) does not exist on Android versions before 5.0. I am using io.pristine:libjingle:9127@aar. What can I do to fix this?
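From what I understand, the ByteBuffer overload of AudioTrack.write only appeared in API 21, which matches the "unable to resolve virtual method" warning from dalvikvm above. If I had to patch it myself, I imagine the guard would look roughly like this sketch (the class and method names here are my own illustration, not the actual libjingle source):
import android.media.AudioTrack;
import android.os.Build;
import java.nio.ByteBuffer;
final class CompatAudioTrackWriter {
// Write PCM data with whichever AudioTrack.write overload the platform supports.
static int write(AudioTrack track, ByteBuffer buffer, int sizeInBytes) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
// API 21+: the ByteBuffer overload exists.
return track.write(buffer, sizeInBytes, AudioTrack.WRITE_BLOCKING);
}
// Pre-21 fallback: copy into a byte[] and use the overload available since API 1.
byte[] bytes = new byte[sizeInBytes];
buffer.get(bytes, 0, sizeInBytes);
return track.write(bytes, 0, sizeInBytes);
}
}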
Here is my source code:
package com.example.nyari.webopeer;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.ArrayAdapter;
import android.widget.Button;
import android.widget.EditText;
import android.widget.ListView;
import android.widget.TextView;
import android.widget.Toast;
import org.json.JSONException;
import org.json.JSONObject;
import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.DataChannel;
import org.webrtc.IceCandidate;
import org.webrtc.MediaConstraints;
import org.webrtc.MediaStream;
import org.webrtc.PeerConnection;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.SdpObserver;
import org.webrtc.SessionDescription;
import java.util.ArrayList;
import java.util.List;
import io.socket.client.Socket;
import io.socket.emitter.Emitter;
public class MainActivity extends AppCompatActivity implements PeerConnection.Observer,SdpObserver {
static {
System.loadLibrary("louts");
}
public native Socket socketIO();
EditText edit;
TextView hello;
Button button, button2, button3;
ListView listView;
ArrayList<String> list;
ArrayAdapter<String> adapter;
Socket client;
Thread t, voice, play_v;
Emitter.Listener Hello, enterGroup, leaveGroup, message, androidi, connect, candidate, offer, answer;
String MESSAGE;
List<PeerConnection.IceServer> iceServer;
String Type_Signal;
/////SOUND MANAGEMENT
private static String AUDIO_TRACK_ID_LOCAL = "AUDIO_TRACK_ID_LOCAL";
private static String AUDIO_TRACK_ID_REMOTE = "AUDIO_TRACK_ID_REMOTE";
private static String LOCAL_MEDIA_STREAM_ID = "LOCAL_MEDIA_STREAM_ID";
PeerConnectionFactory peerConnectionFactory;
AudioTrack localAudioTrack, remoteAudioTrack;
PeerConnection peerConnection;
MediaConstraints audioConstraints;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
//INITIALISE the PeerConnectionFactory globals and verify they initialised; if not, do not continue
boolean peer = PeerConnectionFactory.initializeAndroidGlobals(
getApplicationContext(),
true,//initialize the audio portion of WebRTC
false,//initialize the video portion of WebRTC
true,//enable hardware acceleration
null//renderEGLContext can be provided to support HW video decoding to texture and will be used to create a shared EGL context on the video decoding thread
);
if (peer) {
Toast.makeText(getApplicationContext(), "WebRTC initialised", Toast.LENGTH_LONG).show();
///IF the globals initialised correctly, create the PeerConnectionFactory object
peerConnectionFactory = new PeerConnectionFactory();
Log.i("PCTEST", " factory value " + String.valueOf(peerConnectionFactory));
} else {
Toast.makeText(getApplicationContext(), "WebRTC did not initialise", Toast.LENGTH_LONG).show();
}
//SET media constraints
audioConstraints = new MediaConstraints();
audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "false"));
audioConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
//// First we create an AudioSource
AudioSource audioSource = peerConnectionFactory.createAudioSource(audioConstraints);
// Once we have that, we can create our AudioTrack
// Note that AUDIO_TRACK_ID can be any string that uniquely
// identifies that audio track in your application
localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID_LOCAL, audioSource);
// We start out with an empty MediaStream object,
// created with help from our PeerConnectionFactory
// Note that LOCAL_MEDIA_STREAM_ID can be any string
MediaStream mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
mediaStream.addTrack(localAudioTrack);
/////////////
//////////////////////
//BELOW WE DEAL WITH SIGNALING AND SOCKET.IO
setContentView(R.layout.activity_main);
hello = (TextView) findViewById(R.id.textView);
///////////////
list = new ArrayList<String>();
adapter = new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1, list);
listView = (ListView) findViewById(R.id.list);
listView.setAdapter(adapter);
////////////////////////
edit = (EditText) findViewById(R.id.edit);
///////////////////////
emit();
button = (Button) findViewById(R.id.button1);
button.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
}
});
//////////////////////
button2 = (Button) findViewById(R.id.button2);
button2.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
}
});
/////////////////////
button3 = (Button) findViewById(R.id.button3);
button3.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
}
});
////INIT WEBRTC PEERCONNECTION
iceServer = new ArrayList<PeerConnection.IceServer>();
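// NOTE: the IceServer below has an empty URI and credentials, so it is only a placeholder; real STUN/TURN entries are needed once peers are not on the same local network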
iceServer.add(new PeerConnection.IceServer("", "", ""));
peerConnection = peerConnectionFactory.createPeerConnection(iceServer, audioConstraints, this);
peerConnection.addStream(mediaStream);
t = new Thread(new Runnable() {
@Override
public void run() {
client = socketIO();
client.on("Hello", Hello);
client.on("connect", connect);
client.on("candidate", candidate);
client.on("offer", offer);
client.on("answer", answer);
client.on("android", androidi);
/* client.on("leaveGroup",leaveGroup);
client.on("message",message);*/
client.connect();
}
});
t.start();
}
@Override
protected void onDestroy() {
super.onDestroy();
client.disconnect();
t.interrupt();
}
private void emit() {
Hello = new Emitter.Listener() {///HELLO MESSAGE FROM THE SERVER
@Override
public void call(final Object... args) {
runOnUiThread(new Runnable() {
@Override
public void run() {
final JSONObject obj = (JSONObject) args[0];
try {
MESSAGE = obj.getString("ki");
hello.setText(MESSAGE);
} catch (JSONException e) {
e.printStackTrace();
}
}
});
}
};
//////////
connect = new Emitter.Listener() {///ON CONNECT: JOIN THE SIGNALING GROUP
@Override
public void call(final Object... args) {
JSONObject reg = new JSONObject();
try {
reg.put("grp", "form_1");
} catch (JSONException e) {
e.printStackTrace();
}
client.emit("enterGroup", reg);
}
};
///////
//////////
candidate = new Emitter.Listener() {///RECEIVE REMOTE ICE CANDIDATES FROM THE SERVER
@Override
public void call(final Object... args) {
JSONObject reg = new JSONObject();
try {
reg.put("grp", "form_1");
} catch (JSONException e) {
e.printStackTrace();
}
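// NOTE: the incoming candidate in args is never applied here; a complete handler would parse it and call peerConnection.addIceCandidate(...)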
}
};
//////////
offer = new Emitter.Listener() {///RECEIVE AN OFFER FROM THE REMOTE PEER
@Override
public void call(final Object... args) {
Type_Signal="answer";
/* peerConnection.createAnswer(MainActivity.this,audioConstraints);*/
// SessionDescription fi=(SessionDescription)args[0];
final SessionDescription sesso=new SessionDescription(SessionDescription.Type.OFFER,args[0].toString());
peerConnection.setRemoteDescription(MainActivity.this,sesso);
peerConnection.createAnswer(MainActivity.this,audioConstraints);
/*
runOnUiThread(new Runnable(){
@Override
public void run() {
Toast.makeText(getApplicationContext(),sesso.toString(), Toast.LENGTH_LONG).show();
}
});*/
Log.d("OFFER", sesso.description);
}
};
//////////
answer = new Emitter.Listener() {///RECEIVE AN ANSWER FROM THE REMOTE PEER
@Override
public void call(final Object... args) {
final SessionDescription sesso=new SessionDescription(SessionDescription.Type.ANSWER,args[0].toString());
peerConnection.setRemoteDescription(MainActivity.this,sesso);
/* runOnUiThread(new Runnable(){
@Override
public void run() {
Toast.makeText(getApplicationContext(),sesso.toString(), Toast.LENGTH_LONG).show();
}
});*/
Log.d("ANSWER",sesso.toString());
}
};
//////////
androidi = new Emitter.Listener() {///SERVER EVENT THAT WOULD TRIGGER OFFER CREATION (currently commented out below)
@Override
public void call(final Object... args) {
///////////////USED WHEN OFFER IS CREATED TO EMIT OFFER
/* Type_Signal="offer";
peerConnection.createOffer(MainActivity.this, audioConstraints);*/
}
};
}
@Override
public void onStart() {
super.onStart();
}
@Override
public void onStop() {
super.onStop();
}
@Override
public void onSignalingChange(PeerConnection.SignalingState signalingState) {
}
@Override
public void onIceConnectionChange(PeerConnection.IceConnectionState iceConnectionState) {
}
@Override
public void onIceGatheringChange(PeerConnection.IceGatheringState iceGatheringState) {
}
@Override
public void onIceCandidate(IceCandidate iceCandidate) {
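// NOTE: this feeds the locally gathered candidate back into this peer connection; it would normally also be sent to the remote peer over the signaling socket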
peerConnection.addIceCandidate(iceCandidate);
Log.d("ICE","IceCandidate added :"+iceCandidate.sdp);
}
@Override
public void onAddStream(MediaStream mediaStream) {
if(mediaStream.audioTracks.size()>0){
remoteAudioTrack=mediaStream.audioTracks.get(0);
}
Log.d("STREAMA","Receiving streams");
}
@Override
public void onRemoveStream(MediaStream mediaStream) {
remoteAudioTrack=mediaStream.audioTracks.remove();
}
@Override
public void onDataChannel(DataChannel dataChannel) {
}
@Override
public void onRenegotiationNeeded() {
}
@Override
public void onCreateSuccess(SessionDescription sessionDescription) {
// hello.setText(sessionDescription.description);
peerConnection.setLocalDescription(MainActivity.this,sessionDescription);
/* JSONObject regu = new JSONObject();
try {
regu.put(Type_Signal, sessionDescription);
client.emit(Type_Signal,regu);
} catch (JSONException e) {
e.printStackTrace();
}*/
client.emit(Type_Signal,sessionDescription.description);
// Log.d("ANSWERING",sessionDescription.description);
}
@Override
public void onSetSuccess() {
}
@Override
public void onCreateFailure(final String s) {
runOnUiThread(new Runnable(){
@Override
public void run() {
Toast.makeText(getApplicationContext(), "Failed to create offer because " + s, Toast.LENGTH_LONG).show();
}
});
}
@Override
public void onSetFailure(final String s) {
runOnUiThread(new Runnable(){
@Override
public void run() {
Toast.makeText(getApplicationContext(), "Failed to set offer because " + s, Toast.LENGTH_LONG).show();
}
});
}
}
Answer 0 (score: 0):
I solved the problem by using the latest version of libjingle, 'io.pristine:libjingle:9944@aar'. Hope it helps someone.
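For anyone applying the same fix, it is a one-line dependency change in the module's build.gradle. A minimal sketch, assuming the Gradle "compile" configuration these @aar artifacts were used with at the time:
dependencies {
// compile 'io.pristine:libjingle:9127@aar' // old build; audio playout failed on the 4.4.2 device
compile 'io.pristine:libjingle:9944@aar' // newer build where playout worked for me
}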