How to play a webrtc.AudioTrack on Android (no video)

Asked: 2016-07-25 12:39:37

Tags: android webrtc libjingle

I am trying to use the native webrtc SDK (libjingle) for Android. So far I can send streams from Android to the web (or other platforms) just fine. I can also receive a MediaStream from a peer (in the onAddStream callback).

The project I am working on only needs audio streams. No video tracks are created or sent to anyone.

My question is: how do I play the MediaStream object that I get from the remote peer?

@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    // Need to play the audio ///
}

Again, the question is about audio; I am not using video at all. Apparently all the native webrtc examples use video tracks, so I had no luck finding any documentation or examples on the web.

Thanks in advance!

1 Answer:

Answer 0 (score: 2):

We can get the remote AudioTrack using the code below:

import org.webrtc.AudioTrack;
import org.webrtc.MediaStream;

// Keep a reference so the track can be controlled later (mute, etc.).
private AudioTrack remoteAudioTrack;

@Override
public void onAddStream(final MediaStream stream) {
    if (stream.audioTracks.size() > 0) {
        remoteAudioTrack = stream.audioTracks.get(0);
    }
}
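
Once you hold the reference, the track can be controlled like any other MediaStreamTrack. As one example (a sketch; muteRemoteAudio is just an illustrative helper name), the remote audio can be muted and unmuted with setEnabled():

private void muteRemoteAudio(boolean mute) {
    // setEnabled() comes from org.webrtc.MediaStreamTrack, which AudioTrack extends.
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setEnabled(!mute);
    }
}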

"Apparently all the native webrtc examples use video tracks, so I had no luck finding any documentation or examples on the web."

Yes, as app developers we only have to take care of video rendering ourselves. Once the remote AudioTrack has been received, it plays automatically through the default audio device (earpiece / loudspeaker / wired headset), depending on the audio routing and proximity settings.
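
For the routing described above to behave like a regular voice call, the app usually puts the platform AudioManager into communication mode for the duration of the call (this is what the AppRTC demo's audio manager does, among other things). A minimal sketch using the standard android.media.AudioManager; the startCallAudio/stopCallAudio helper names are illustrative:

import android.content.Context;
import android.media.AudioManager;

// Call startCallAudio() when the call starts and stopCallAudio() when it ends.
private void startCallAudio(Context context) {
    AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    // Route audio the way a voice/VoIP call would (earpiece by default, proximity handling, etc.).
    am.setMode(AudioManager.MODE_IN_COMMUNICATION);
}

private void stopCallAudio(Context context) {
    AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    am.setMode(AudioManager.MODE_NORMAL);
}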

Check the code below from AppRTCAudioManager.java to enable/disable the speakerphone:

/** Sets the speaker phone mode. */
private void setSpeakerphoneOn(boolean on) {
    boolean wasOn = audioManager.isSpeakerphoneOn();
    if (wasOn == on) {
        return;
    }
    audioManager.setSpeakerphoneOn(on);
}

Reference Source: AppRTCAudioManager.java
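
Note that setSpeakerphoneOn() above is a private helper inside AppRTCAudioManager. If you are not reusing that class, the same effect can be achieved by calling the framework API directly, e.g. from a toggle button in your Activity (a sketch, not code from the demo app):

// Assumes this runs inside an Activity (or any other Context).
AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
am.setSpeakerphoneOn(!am.isSpeakerphoneOn());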