2011-04-06
9

I use VLC as an RTSP server, and this code plays the RTSP stream with MediaPlayer on Android:

# vlc -vvv /home/marco/Videos/pippo.mp4 --sout \
  '#rtp{dst=192.168.100.246,port=6024-6025,sdp=rtsp://192.168.100.243:8080/test.sdp}'

and in the Android project:


Uri videoUri = Uri.parse("rtsp://192.168.100.242:8080/test.sdp"); 
videoView.setVideoURI(videoUri); 
videoView.start(); 

This works fine, but I also want to play a live RTP stream, so I copied the SDP file to the SD card (/mnt/sdcard/test.sdp) and set up VLC like this:

# vlc -vvv /home/marco/Videos/pippo.mp4 --sout \
  '#rtp{dst=192.168.100.249,port=6024-6025}'

I then tried to play the RTP stream by pointing MediaPlayer at the local path of the SDP file:


Uri videoUri = Uri.parse("/mnt/sdcard/test.sdp"); 
videoView.setVideoURI(videoUri); 
videoView.start(); 

But I get this error:


D/MediaPlayer(9616): Couldn't open file on client side, trying server side 
W/MediaPlayer(9616): info/warning (1, 26) 
I/MediaPlayer(9616): Info (1,26) 
E/PlayerDriver( 76): Command PLAYER_INIT completed with an error or info PVMFFailure 
E/MediaPlayer(9616): error (1, -1) 
E/MediaPlayer(9616): Error (1,-1) 
D/VideoView(9616): Error: 1,-1 

Does anyone know where the problem is? Am I doing something wrong, or is it simply not possible to play RTP with MediaPlayer? Cheers, Giorgio

Answers

-1

Unfortunately, it is not possible to play RTP streams with the Android MediaPlayer.

One solution to this problem is to decode the RTP stream with ffmpeg. Tutorials on how to compile ffmpeg for Android can be found on the web.
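Before porting anything to the device, the ffmpeg route can be sanity-checked on a desktop. A recent ffmpeg build can read the session directly from the SDP file; the option and file names below are illustrative, not taken from the answer:

```shell
# Receive the RTP session described by test.sdp and remux it into a local file.
# -protocol_whitelist must explicitly allow every protocol the SDP pulls in.
ffmpeg -protocol_whitelist file,udp,rtp -i test.sdp -c copy received.mp4
```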

+0

Could you show us some links? – 2011-11-28 16:34:01

+1

I searched the web a lot; there are several tutorials and articles explaining how to build it. Some work, some don't. You may want to look at this post http://www.roman10.net/?p=389 or even at what the RockPlayer folks did: http://www.rockplayer.com/tech_en.html. The first one relies on the build scripts from the RockPlayer guys. – ladi 2011-11-29 08:40:27

2

I have a partial solution for you.

I am currently working on an R&D project involving RTP streaming of media from a server to Android clients.

Through this work I contribute to my own library called smpte2022lib, which you can find here: http://sourceforge.net/projects/smpte-2022lib/

With the help of these libraries (the Java implementation is currently the best one), you may be able to parse RTP multicast streams coming from professional streaming equipment, from VLC RTP sessions, and so on.

I have tested it successfully with captured professional RTP streams carrying SMPTE 2022 2D-FEC, and with simple streams generated with VLC.

Unfortunately I cannot post a code snippet here, because the project that uses the library is currently under copyright, but I assure you that you can use it simply by parsing a UDP stream with the help of the RtpPacket constructor.

If the packets are valid RTP packets, their (byte) payload will be decoded.

At the moment I wrap the calls to the RtpPacket constructor in a thread that stores the decoded payloads into a media file, and then I start a VideoView with that file as parameter.
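I cannot speak for smpte2022lib's actual API, but the general idea of that receiving thread, namely validating each UDP datagram as RTP (RFC 3550) and extracting the payload to append to a file, can be sketched in plain Java. All names below are hypothetical, not the library's:

```java
import java.util.Arrays;

public class RtpDemo {
    /**
     * Strips the RTP (RFC 3550) header from a received datagram and returns
     * the payload, or null if the datagram is not a plausible RTP packet.
     */
    static byte[] rtpPayload(byte[] dgram, int len) {
        if (len < 12) return null;                      // fixed header is 12 bytes
        int version = (dgram[0] & 0xC0) >> 6;
        if (version != 2) return null;                  // RTP version must be 2
        int csrcCount = dgram[0] & 0x0F;                // CC: number of 32-bit CSRC ids
        boolean hasExtension = (dgram[0] & 0x10) != 0;  // X bit
        int offset = 12 + 4 * csrcCount;
        if (hasExtension) {
            if (len < offset + 4) return null;
            // Extension header: 16-bit profile id, then 16-bit length in 32-bit words.
            int extWords = ((dgram[offset + 2] & 0xFF) << 8) | (dgram[offset + 3] & 0xFF);
            offset += 4 + 4 * extWords;
        }
        if (offset > len) return null;
        return Arrays.copyOfRange(dgram, offset, len);
    }
}
```

In the answer's setup, the thread would call something like this on every datagram read from the socket and append the non-null results to the media file before handing that file to the VideoView.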

Fingers crossed ;-)

Kind regards,

David Fischer

0

It may be possible on Android (not with MediaPlayer, but with other things further down the stack), but do you really want to pursue RTSP/RTP when the rest of the media ecosystem does not?

IMO there are better media/streaming approaches under the umbrella of HTML5/WebRTC. For example, look at what 'Ondello' is doing.

That said, here is some old project code for Android/RTSP/SDP/RTP that uses 'netty' and 'efflux'. It will negotiate the 'session' part with an SDP file provider. I can't remember whether it would actually play the audio portion of YouTube/RTSP streams, but that was my goal at the time. (I think it worked using the AMR-NB codec, but there were tons of problems and I dropped RTSP on Android like a bad habit!)

on Git ....

@Override
public void mediaDescriptor(Client client, String descriptor)
{
    // Search the SDP for "control:" attributes on the session and media sections.
    final String target = "control:";
    Log.d(TAG, "Session Descriptor\n" + descriptor);
    int position = -1;
    while ((position = descriptor.indexOf(target)) > -1)
    {
        descriptor = descriptor.substring(position + target.length());
        resourceList.add(descriptor.substring(0, descriptor.indexOf('\r')));
    }
}

private int nextPort()
{
    return (port += 2) - 2;
}

private void getRTPStream(TransportHeader transport)
{
    String[] words;
    // Only the "2000" part of "client_port=2000-2001" in the Transport
    // header of the response is wanted.
    words = transport.getParameter("client_port")
            .substring(transport.getParameter("client_port").indexOf("=") + 1).split("-");
    port_lc = Integer.parseInt(words[0]);

    words = transport.getParameter("server_port")
            .substring(transport.getParameter("server_port").indexOf("=") + 1).split("-");
    port_rm = Integer.parseInt(words[0]);

    source = transport.getParameter("source")
            .substring(transport.getParameter("source").indexOf("=") + 1);
    ssrc = transport.getParameter("ssrc")
            .substring(transport.getParameter("ssrc").indexOf("=") + 1);

    // Assume dynamic RTP payload type 99.
    getRTPStream(session, source, port_lc, port_rm, 99);

    Log.d(TAG, "raw parms " + port_lc + " " + port_rm + " " + source);
    Log.d(TAG, "session: " + session);
    Log.d(TAG, "transport: " + transport.getParameter("client_port")
            + " " + transport.getParameter("server_port")
            + " " + transport.getParameter("source")
            + " " + transport.getParameter("ssrc"));
}

private void getRTPStream(String session, String source, int portl, int portr, int payloadFormat)
{
    InetAddress addr;
    try {
        addr = InetAddress.getLocalHost();
        // Client IP address, hard-coded here instead of addr.getHostAddress().
        LAN_IP_ADDR = "192.168.1.125";
        Log.d(TAG, "using client IP addr " + LAN_IP_ADDR);
    } catch (UnknownHostException e1) {
        e1.printStackTrace();
    }

    // efflux participants: local receiver and the remote sender,
    // each with an RTP data port and the adjacent RTCP control port.
    RtpParticipant local1 = RtpParticipant.createReceiver(
            new RtpParticipantInfo(1), LAN_IP_ADDR, portl, portl += 1);
    RtpParticipant remote1 = RtpParticipant.createReceiver(
            new RtpParticipantInfo(2), source, portr, portr += 1);

    remote1.getInfo().setSsrc(Long.parseLong(ssrc, 16));
    session1 = new SingleParticipantSession(session, payloadFormat, local1, remote1);
    Log.d(TAG, "remote ssrc " + session1.getRemoteParticipant().getInfo().getSsrc());

    session1.init();
    session1.addDataListener(new RtpSessionDataListener() {
        @Override
        public void dataPacketReceived(RtpSession session, RtpParticipantInfo participant, DataPacket packet) {
            getPackByte(packet);
        }
    });
}

// TODO: should collaborate with the AudioTrack object and write into the
// AudioTrack buffer (AudioTrack.write was blocking forever).
public void getPackByte(DataPacket packet)
{
    // TODO: this is getting called, but not sure why only one time,
    // or whether it is stalling in mid-execution.
    // Only AMR-NB frames at 12.2 kbit/s (format type 7) are handled.
    // After the normal header, getDataAsArray() contains 10 extra bytes of
    // dynamic header; the first frame includes the 1-byte frame header whose
    // value should be used when writing subsequent frame separators.
    Log.d(TAG, "getPackByt start and play");

    if (!started) {
        Log.d(TAG, " PLAY audioTrak");
        track.play();
        started = true;
    }

    track.write(packet.getDataAsArray(), 0, packet.getDataSize());
    Log.d(TAG, "getPackByt aft write");

    nBytesRead += packet.getDataSize();
    if (nBytesRead % 500 < 375) Log.d(TAG, " getPackByte plus 5K received");
}