2014-10-19 · 496 views

Low-latency RTSP video streaming to Android using ffmpeg

I am trying to stream live webcam video from an Ubuntu 12.04 PC to an Android device running KitKat. So far, I have written an ffserver configuration file that receives an ffm feed and broadcasts it over the RTSP protocol. I can watch the stream with ffplay on other computers in the same LAN.

How can I watch the stream on the Android device? The code below works fine when the webcam image is streamed with VLC, but it does not work with ffmpeg:

import android.app.Activity;
import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;
import java.io.IOException;

public class MainActivity extends Activity
        implements MediaPlayer.OnPreparedListener, SurfaceHolder.Callback {

    static final String TAG = "MainActivity";
    static final String RTSP_URL = "rtsp://192.168.1.54:4424/test.sdp";

    private MediaPlayer _mediaPlayer;
    private SurfaceHolder _surfaceHolder;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set up a full-screen black window.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        Window window = getWindow();
        window.setFlags(
                WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        window.setBackgroundDrawableResource(android.R.color.black);
        window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_main);

        // Configure the view that renders live video.
        // R.id.videoView is a plain SurfaceView element in the layout XML file.
        SurfaceView videoView = (SurfaceView) findViewById(R.id.videoView);
        _surfaceHolder = videoView.getHolder();
        _surfaceHolder.addCallback(this);
        _surfaceHolder.setFixedSize(320, 240);
    }

    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        _mediaPlayer = new MediaPlayer();
        _mediaPlayer.setDisplay(_surfaceHolder);
        Context context = getApplicationContext();
        Uri source = Uri.parse(RTSP_URL);
        try {
            // Point the player at the RTSP stream.
            _mediaPlayer.setDataSource(context, source);

            // Begin the asynchronous setup of the video stream.
            _mediaPlayer.setOnPreparedListener(this);
            _mediaPlayer.prepareAsync();
        } catch (IOException e) {
            // An empty catch block hides setup failures; at least log them.
            Log.e(TAG, "Failed to set data source", e);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {}

    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        _mediaPlayer.start();
    }
}
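One illustrative addition (a sketch of mine, not from the original post): attaching an OnErrorListener right after creating the MediaPlayer in surfaceCreated(), so that playback failures such as the error (1, -2147483648) mentioned below are logged instead of failing silently:

```java
// Hypothetical sketch: log MediaPlayer errors instead of swallowing them.
// Add this immediately after `new MediaPlayer()` in surfaceCreated().
_mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // `what` is a MEDIA_ERROR_* constant; `extra` is an implementation-specific code.
        Log.e("MainActivity", "MediaPlayer error: what=" + what + " extra=" + extra);
        return true; // true = handled, suppress the default error dialog
    }
});
```

This does not fix the stream itself, but it makes the failure mode visible in logcat.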

My ffserver.config file:

HTTPPort 8090
RTSPBindAddress 0.0.0.0
RTSPPort 4424
MaxBandwidth 10000
CustomLog -

<Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 20M
    ACL allow 127.0.0.1
</Feed>

<Stream test1.sdp>
    Feed feed1.ffm
    Format rtp
    VideoCodec libx264
    VideoSize 640x480
    AVOptionVideo flags +global_header
    AVOptionVideo me_range 16
    AVOptionVideo qdiff 4
    AVOptionVideo qmin 10
    AVOptionVideo qmax 51
    NoAudio
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255
</Stream>

I start the stream with this command:

ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -b:v 600k http://localhost:8090/feed1.ffm


Perhaps use Wireshark to check what is happening at the RTSP level: is the connection opened, and is a track found? If no track is found, the problem most likely lies with ffserver; otherwise, if data is being pushed, it may be a format issue that Android cannot handle. – 2014-10-20 08:43:16
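As a sketch of that check (assuming tshark, Wireshark's command-line tool, is installed, and using the RTSP port 4424 from the config above):

```shell
# Capture RTSP signalling on ffserver's RTSP port (4424 in this setup).
# -f is the capture filter, -Y limits display to RTSP requests/replies
# (DESCRIBE, SETUP, PLAY, ...), which shows whether a track is negotiated.
sudo tshark -i any -f "tcp port 4424" -Y rtsp
```

Run this on the server while the Android app connects; a DESCRIBE with no subsequent SETUP/PLAY usually means the client rejected the advertised track.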


I checked the formats Android supports [here](http://developer.android.com/guide/appendix/media-formats.html), and I am using a supported format. I am also sure the connection is opened and registered by ffserver. Console output: 'Mon Oct 20 17:04:53 2014 192.168.1.55 - - [DESCRIBE] "rtsp://192.168.1.54:4424/test.sdp RTSP/1.0" 200 72' – grzebyk 2014-10-20 15:05:37


Logcat in Android Studio shows the following error: MediaPlayer: Error (1, -2147483648), which is the generic unknown error (described [here](http://stackoverflow.com/questions/11540076/android-mediaplayer-error-1-2147483648)). – grzebyk 2014-10-20 15:50:49

Answer


The error is most likely caused by VLC and FFmpeg using different encoding parameters: VLC may pick parameters Android supports, while FFmpeg may pick unsupported ones (most likely the AVC profile and level). Try forcing a Baseline or Main profile and the YUV 4:2:0 pixel format through the FFmpeg command-line options and ffserver.config.
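A minimal sketch of such a command, reusing the device and feed URL from the question above (the exact profile and level values are illustrative, not taken from the original post):

```shell
# Force an Android-friendly H.264 Baseline profile, level 3.0,
# and 4:2:0 chroma subsampling when pushing the webcam feed to ffserver.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -profile:v baseline -level 3.0 \
       -pix_fmt yuv420p -b:v 600k \
       http://localhost:8090/feed1.ffm
```

Note that ffserver may re-encode the feed according to its own Stream settings, so the matching profile options may also need to be applied in ffserver.config for the change to reach the client.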