I am using libstreaming to create an RTSP server on an Android phone, and another phone to connect to the server and play the live stream. The server should capture video and audio with its camera and microphone and stream them to the client. After connecting, the video plays properly, but there is no sound.
Here is the relevant part of my RTSP server code:
mSession = SessionBuilder.getInstance()
        .setSurfaceView(mSurfaceView)
        .setPreviewOrientation(90)
        .setContext(getApplicationContext())
        .setAudioEncoder(SessionBuilder.AUDIO_AAC)
        //.setAudioQuality(new AudioQuality(16000, 32000))
        .setAudioQuality(new AudioQuality(8000, 16000))
        .setVideoEncoder(SessionBuilder.VIDEO_H264)
        //.setVideoQuality(new VideoQuality(320, 240, 20, 500000))
        .build();
mSession.startPreview(); // camera preview on the phone's surface
mSession.start();
I searched for this problem, and some people said I should modify the destination ports in SessionBuilder.java.
I tried the following modification, but it still did not work:
if (session.getAudioTrack() != null) {
    Log.e("SessionBuilder", "Audio track != null");
    AudioStream audio = session.getAudioTrack();
    audio.setAudioQuality(mAudioQuality);
    audio.setDestinationPorts(5008);
}
Does anybody know the cause of this problem?
By the way, I use VLC on another phone as the client, connecting to the server with the following URL:
rtsp://MY_IP:1234?h264=200-20-320-240
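For context on that query string: in libstreaming, the URI parameters select which streams the server sets up and at what quality, and the `h264=...` value is read as bitrate (kbps), framerate, width, height. A minimal plain-Java sketch of that parsing (`QualityParser` is a hypothetical name for illustration, not part of the library):

```java
// Hypothetical helper mirroring how libstreaming reads a
// "bitrateKbps-framerate-width-height" string such as "200-20-320-240".
public class QualityParser {
    public static int[] parse(String q) {
        String[] parts = q.split("-");
        int[] v = new int[4];
        for (int i = 0; i < 4; i++) {
            // order: bitrate (kbps), framerate, width, height
            v[i] = Integer.parseInt(parts[i]);
        }
        return v;
    }

    public static void main(String[] args) {
        int[] v = parse("200-20-320-240");
        System.out.println("bitrate=" + v[0] + "kbps, fps=" + v[1]
                + ", resolution=" + v[2] + "x" + v[3]);
        // prints "bitrate=200kbps, fps=20, resolution=320x240"
    }
}
```

So the URL above asks for an H.264 video stream at 200 kbps, 20 fps, 320x240, which matches the commented-out `VideoQuality(320, 240, 20, 500000)` settings in spirit.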
Thanks
I traced the source code and found that the server never received a request for the audio stream, only for the video stream: after the connection is set up in RtspServer.java, the only trackID received is 1.
(trackID=0 corresponds to the AudioStream and trackID=1 to the VideoStream.)
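The trackID can be read straight out of the client's RTSP SETUP request URI, which makes it easy to log which tracks the client actually asks for. A minimal standalone sketch of the mapping described above (`TrackIdDemo` is a hypothetical class for illustration; the URI format follows what libstreaming's RtspServer receives):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TrackIdDemo {
    // Extracts the trackID from an RTSP SETUP URI such as
    // "rtsp://192.168.0.2:1234/trackID=1"; returns -1 if absent.
    public static int parseTrackId(String setupUri) {
        Matcher m = Pattern.compile("trackID=(\\d+)").matcher(setupUri);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }

    // trackID=0 is the audio track, trackID=1 the video track.
    public static String describe(int trackId) {
        switch (trackId) {
            case 0:  return "AudioStream";
            case 1:  return "VideoStream";
            default: return "unknown";
        }
    }

    public static void main(String[] args) {
        int id = parseTrackId("rtsp://192.168.0.2:1234/trackID=1");
        System.out.println(id + " -> " + describe(id)); // prints "1 -> VideoStream"
    }
}
```

In my case, only a SETUP for trackID=1 ever arrived, which means the client never requested the audio track at all.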
I solved this problem by using a different URL:
Thanks