
Samsung Gear Live audio encoding

I am currently working on an Android Wear application and am looking into audio recording. I followed the tutorial on the Android developer site; it works on my Nexus 7, but not on the Samsung Gear Live I use for testing. The application keeps crashing.

Digging into the problem a bit, I may have found that the issue comes from two of the recorder's parameters: OutputFormat and AudioEncoder. I tried every available combination of OutputFormat and AudioEncoder, but without any luck.

So here is my question: has anyone run into the same problem? If so, did you find the right format/encoder combination?

I am not pasting my code, since it is exactly the same as in the documentation. Here is the link if you want to take a look: http://developer.android.com/guide/topics/media/audio-capture.html
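
For reference, the MediaRecorder setup from that guide boils down to roughly the following (a sketch from memory, not my exact code); the two calls I have been varying are setOutputFormat() and setAudioEncoder():

import android.media.MediaRecorder;

import java.io.IOException;

// Roughly the MediaRecorder setup from the audio-capture guide.
// outputFile is assumed to be a writable path on the device.
private MediaRecorder startMediaRecorder(String outputFile) throws IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    // The two parameters I have been permuting without success on the Gear Live:
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    recorder.setOutputFile(outputFile);
    recorder.prepare();
    recorder.start(); // on the Gear Live this is where the crash shows up
    return recorder;
}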

Thanks in advance for your answers and your time :)


I have been digging some more and found some... interesting/unsettling information... When using AAC encoding instead of AMR_NB or AMR_WB, the application does not crash; instead I get a 'mediaserver died' error when the start() method is called on my 'MediaRecorder'. – Snow 2014-10-01 08:40:25


Have you tried the default codec? My understanding is that there are no compression codecs on Android Wear, so you would need to capture the audio data without compression, and that should work for you, but I have not tested it, so I do not have an example. – 2014-10-17 05:19:01


Yes, I also tried the default, but unfortunately it does not work. I will try without compression, thanks :) – Snow 2014-10-17 08:06:00

Answer


The root problem is that you cannot use MediaRecorder, even though the Android audio capture example does; instead you need to use the AudioRecord class.

In addition, I suggest sending the raw data back to your phone and assembling it into an audio file there, since doing that on the wearable is very tricky; see the sketch after the next paragraph.

For more details, see this answer.
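
As a rough illustration of what I mean by sending the raw data to the phone, something along these lines could replace the logging loop in the sample below, using the Wearable ChannelApi from Google Play services. This is an untested sketch on my part; the "/audio_channel" path, the already-connected GoogleApiClient, and the isRecording flag from the sample are assumptions:

import android.media.AudioRecord;

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.Channel;
import com.google.android.gms.wearable.Node;
import com.google.android.gms.wearable.Wearable;

import java.io.IOException;
import java.io.OutputStream;
import java.util.List;

// Streams raw PCM from the watch to the first connected node (the phone).
// Assumes googleApiClient was built with Wearable.API and is already connected,
// and that this runs on the same background thread that reads the recorder.
private void streamAudioToPhone(GoogleApiClient googleApiClient,
                                AudioRecord recorder, int bufferSize) throws IOException {
    List<Node> nodes = Wearable.NodeApi
            .getConnectedNodes(googleApiClient).await().getNodes();
    if (nodes.isEmpty()) {
        return; // no phone connected
    }
    Channel channel = Wearable.ChannelApi
            .openChannel(googleApiClient, nodes.get(0).getId(), "/audio_channel")
            .await().getChannel();
    OutputStream out = channel.getOutputStream(googleApiClient).await().getOutputStream();
    try {
        byte[] data = new byte[bufferSize];
        while (isRecording) {
            int read = recorder.read(data, 0, bufferSize);
            if (read > 0) {
                out.write(data, 0, read); // the phone reads this from the channel's InputStream
            }
        }
    } finally {
        out.close();
        channel.close(googleApiClient);
    }
}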

Below is a sample that I got working.

import android.app.Activity; 
import android.content.Intent; 
import android.media.AudioFormat; 
import android.media.AudioRecord; 
import android.media.MediaRecorder; 
import android.os.Bundle; 
import android.speech.RecognizerIntent; 
import android.support.wearable.view.WatchViewStub; 
import android.util.Log; 
import android.widget.TextView; 
import android.view.View; 

import java.util.List; 

public class MainActivity extends Activity { 
    private static final String TAG = MainActivity.class.getName(); 
    private static final int SPEECH_REQUEST_CODE = 1; 

    private static final int RECORDER_SAMPLERATE = 44100; 
    private static final int RECORDER_CHANNELS = AudioFormat.CHANNEL_IN_STEREO; 
    private static final int RECORDER_AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT; 

    private TextView mTextView; 
    private AudioRecord recorder; 
    private int bufferSize = 0; 
    private Thread recordingThread = null; 
    private volatile boolean isRecording; 

    @Override 
    protected void onCreate(Bundle savedInstanceState) { 
     Log.v(TAG, "Creating MainActivity"); 
     super.onCreate(savedInstanceState); 
     setContentView(R.layout.activity_main); 
     final WatchViewStub stub = (WatchViewStub) findViewById(R.id.watch_view_stub); 
     stub.setOnLayoutInflatedListener(new WatchViewStub.OnLayoutInflatedListener() { 
      @Override 
      public void onLayoutInflated(WatchViewStub stub) { 
       mTextView = (TextView) stub.findViewById(R.id.text); 
      } 
     }); 

     bufferSize = 
       AudioRecord.getMinBufferSize(RECORDER_SAMPLERATE, 
         RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING); 
    } 

    public void handleRecordButtonClick(View view) { 
     startAudioCapture(); 
    } 

    public void handleStopButtonClick(View view) { 
     stopAudioCapture(); 
    } 

    private void startAudioCapture() { 
     Log.v(TAG, "Starting audio capture"); 
     recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 
       RECORDER_SAMPLERATE, RECORDER_CHANNELS, RECORDER_AUDIO_ENCODING, bufferSize); 
     if (recorder.getState() == AudioRecord.STATE_INITIALIZED) { 
      recorder.startRecording(); 
      isRecording = true; 
      Log.v(TAG, "Successfully started recording"); 

      recordingThread = new Thread(new Runnable() { 

       @Override 
       public void run() { 
        processRawAudioData(); 
       } 
      }, "AudioRecorder Thread"); 

      recordingThread.start(); 
     } else { 
      Log.v(TAG, "Failed to started recording"); 
     } 
    } 

    private void stopAudioCapture() { 
     Log.v(TAG, "Stop audio capture"); 
     // Stop the reader loop first so the recording thread is no longer 
     // reading when the AudioRecord is stopped and released. 
     isRecording = false; 
     recorder.stop(); 
     recorder.release(); 
    } 

    private void processRawAudioData() { 
     byte[] data = new byte[bufferSize]; 
     int read = 0; 
     while (isRecording) { 
      read = recorder.read(data, 0, bufferSize); 

      if (AudioRecord.ERROR_INVALID_OPERATION != read) { 
       // The raw PCM is only logged here; write it out or stream it as needed. 
       Log.v(TAG, "Successfully read " + read + " bytes of audio"); 
      } 
     } 
    } 
}
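
On the phone side, one straightforward way to assemble the received raw PCM into a playable file is to prepend a standard WAV header. The helper below is my own sketch, not part of the recording sample above; it assumes 16-bit PCM at whatever sample rate and channel count the recorder used:

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Wraps raw 16-bit PCM data in a minimal WAV (RIFF) header so it can be played back.
public static void writeWavFile(String path, byte[] pcm,
                                int sampleRate, int channels) throws IOException {
    int byteRate = sampleRate * channels * 2;          // 16-bit samples
    ByteBuffer header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
    header.put("RIFF".getBytes());
    header.putInt(36 + pcm.length);                    // RIFF chunk size
    header.put("WAVE".getBytes());
    header.put("fmt ".getBytes());
    header.putInt(16);                                 // fmt chunk size
    header.putShort((short) 1);                        // audio format: PCM
    header.putShort((short) channels);
    header.putInt(sampleRate);
    header.putInt(byteRate);
    header.putShort((short) (channels * 2));           // block align
    header.putShort((short) 16);                       // bits per sample
    header.put("data".getBytes());
    header.putInt(pcm.length);

    FileOutputStream out = new FileOutputStream(path);
    try {
        out.write(header.array());
        out.write(pcm);
    } finally {
        out.close();
    }
}

For example, once all the chunks streamed from the watch have been concatenated into pcmBytes, calling writeWavFile(getFilesDir() + "/capture.wav", pcmBytes, 44100, 2) produces a file that standard players can open (the path and variable name are just placeholders).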