2015-04-17

AudioRecord producing gaps of zeros on Android 5.01

Using AudioRecord, I am trying to write a test app that records a couple of seconds of audio and displays it on screen. However, I seem to get a recurring pattern of zero-valued regions, as shown below. I'm not sure whether this is normal behavior or an error in my code.

(image: waveform plot showing recurring regions of zeros)

MainActivity.java

public class MainActivity extends Activity implements OnClickListener 
{ 
    private static final int SAMPLE_RATE = 44100; 
    private Button recordButton, playButton; 
    private String filePath; 
    private boolean recording; 
    private AudioRecord record; 
    private short[] data; 
    private TestView testView; 

    @Override 
    protected void onCreate(Bundle savedInstanceState) 
    { 
     super.onCreate(savedInstanceState); 
     setContentView(R.layout.activity_main); 

     recordButton = (Button) findViewById(R.id.recordButton); 
     recordButton.setOnClickListener(this); 

     playButton = (Button) findViewById(R.id.playButton); 
     playButton.setOnClickListener(this); 

     FrameLayout frame = (FrameLayout)findViewById(R.id.myFrame); 
     frame.addView(testView = new TestView(this)); 
    } 

    @Override 
    public void onClick(View v) 
    { 
     if(v.getId() == R.id.recordButton) 
     { 
      if(!recording) 
      { 
       int bufferSize = AudioRecord.getMinBufferSize( SAMPLE_RATE, 
                   AudioFormat.CHANNEL_IN_MONO, 
                   AudioFormat.ENCODING_PCM_16BIT); 

       record = new AudioRecord( MediaRecorder.AudioSource.MIC, 
              SAMPLE_RATE, 
              AudioFormat.CHANNEL_IN_MONO, 
              AudioFormat.ENCODING_PCM_16BIT, 
              bufferSize * 2); 

       data = new short[10 * SAMPLE_RATE]; // Records up to 10 seconds 

       new Thread() 
       { 
        @Override 
        public void run() 
        { 
         recordAudio(); 
        } 

       }.start(); 

       recording = true; 

       Toast.makeText(this, "recording...", Toast.LENGTH_SHORT).show(); 
      } 
      else 
      { 
       recording = false; 
       Toast.makeText(this, "finished", Toast.LENGTH_SHORT).show(); 
      } 
     } 
     else if(v.getId() == R.id.playButton) 
     { 
      testView.invalidate(); 
      Toast.makeText(this, "play/pause", Toast.LENGTH_SHORT).show(); 
     } 
    } 

    void recordAudio() 
    { 
     record.startRecording(); 
     int index = 0; 
     while(recording) 
     { 
      try { 
       Thread.sleep(50); 
      } catch (InterruptedException e) { 
       e.printStackTrace(); 
      } 
      int result = record.read(data, index, SAMPLE_RATE); // read 1 second at a time 
      if(result == AudioRecord.ERROR_INVALID_OPERATION || result == AudioRecord.ERROR_BAD_VALUE) 
      { 
       App.d("SOME SORT OF RECORDING ERROR MATE"); 
       return; 
      } 
      else 
      { 
       index += result; // increment by number of shorts read 
       App.d("read: "+result); 
      } 
     } 
     record.stop(); 
     data = Arrays.copyOf(data, index); 

     testView.setData(data); 
    } 

    @Override 
    protected void onPause() 
    { 

     super.onPause(); 
    } 
} 

TestView.java

public class TestView extends View 
{ 
    private short[] data; 
    Paint paint = new Paint(); 
    Path path = new Path(); 
    float min, max; 

    public TestView(Context context) 
    { 
     super(context); 

     paint.setColor(Color.BLACK); 
     paint.setStrokeWidth(1); 
     paint.setStyle(Style.FILL_AND_STROKE); 
    } 

    void setData(short[] data) 
    { 
     min = Short.MAX_VALUE; 
     max = Short.MIN_VALUE; 
     this.data = data; 
     for(int i = 0; i < data.length; i++) 
     { 
      if(data[i] < min) 
       min = data[i]; 

      if(data[i] > max) 
       max = data[i]; 
     } 
    } 

    @Override 
    protected void onDraw(Canvas canvas) 
    { 
     canvas.drawRGB(255, 255, 255); 
     if(data != null) 
     { 
      float interval = (float)this.getWidth()/data.length; 
      for(int i = 0; i < data.length; i+=10) 
       canvas.drawCircle(i*interval,(data[i]-min)/(max - min)*this.getHeight(),5 ,paint); 

     } 
     super.onDraw(canvas); 
    } 
} 
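The normalization in onDraw divides by (max - min), which fails when all samples are equal. A hypothetical guarded version of that mapping, with `normalize` as an illustrative helper name:

```java
// Guarded version of the sample-to-pixel mapping used in onDraw above:
// map a sample into [0, height] but avoid dividing by zero when the
// signal is flat (min == max).
public class Normalize {
    static float normalize(short sample, float min, float max, float height) {
        if (max == min) return height / 2f;  // flat signal: draw a midline
        return (sample - min) / (max - min) * height;
    }

    public static void main(String[] args) {
        System.out.println(normalize((short) 0, -100f, 100f, 200f)); // prints "100.0"
        System.out.println(normalize((short) 5, 5f, 5f, 200f));      // prints "100.0"
    }
}
```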
Note: your onDraw will have a problem if min == max. –

I'm starting to think the read method is returning a count of bytes rather than shorts. Also, does each 16-bit sample go into one short, or are 8 bits put into one short and 8 into the next? –

Answers

2

Your navigation bar icons make it look like you are probably running Android 5, and a bug shipped in the Android 5.0 release that can cause exactly the problem you are seeing.

Recording into shorts gave a misleading return value in the L preview, and while heavily reworking the code for the fix they mistakenly doubled the offset parameter in the 5.0 release. Your code advances the index by the (correct) amount it reads on each call, but the pointer-math error inside the audio internals doubles the offset you pass in, which means each chunk of recorded data is followed by an equal-sized span of unwritten buffer, and you see those gaps of zeros.

The issue was reported at http://code.google.com/p/android/issues/detail?id=80866

A patch was submitted last fall but declined at the time, as they said they had already dealt with it internally. Looking at the git history of AOSP 5.1, that appears to be the internal commit 283a9d9e1 of November 13th, which was not yet public when I ran into the issue later that month. While I haven't tried this on 5.1 yet, it seems like it should fix it, so the bug is probably present from 5.0 through 5.0.2 (and in a different form in the L preview), while 4.4 and earlier, as well as 5.1 and later, behave correctly.

The simplest workaround that behaves consistently across broken and unbroken release versions is to avoid ever passing a non-zero offset when recording shorts - that is how I fixed the program in which I hit the problem. A more elaborate idea would be to figure out whether you are on a broken version and, if so, halve the offset you pass. One approach is to check the device's OS version, but conceivably some vendor or custom ROM builds of 5.0 might already be patched, so you could go further and do a short test recording at a known offset into a zeroed buffer, then scan it to see where the non-zero data actually starts.
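The detection idea in the last paragraph can be sketched in plain Java, with no Android dependencies. `detectEffectiveOffset` and the simulated doubled write below are hypothetical stand-ins: on a device you would zero a buffer, call `AudioRecord.read(probe, requestedOffset, count)`, and then run the same scan.

```java
// Hypothetical probe for the offset-doubling bug described above. The loop
// that writes at 2 * requestedOffset simulates what a broken 5.0 device does;
// on real hardware that write would come from AudioRecord.read().
public class OffsetProbe {
    /** Returns the index of the first non-zero sample, or -1 if all zero. */
    static int detectEffectiveOffset(short[] probe) {
        for (int i = 0; i < probe.length; i++) {
            if (probe[i] != 0) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int requestedOffset = 100;
        short[] probe = new short[400];          // zeroed buffer
        // Simulate a broken device writing at double the requested offset:
        for (int i = 2 * requestedOffset; i < 2 * requestedOffset + 50; i++) {
            probe[i] = 1234;
        }
        int effective = detectEffectiveOffset(probe);
        System.out.println(effective == 2 * requestedOffset); // prints "true"
    }
}
```

If the first non-zero sample lands at twice the offset you requested, you are on a broken build and can halve the offsets you pass (or better, switch to the zero-offset workaround above).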

I've been extremely unlucky running into Android bugs. Thanks, I am running 5.01. –

0

I can't check your code right now, but I can offer you some sample code that you can test:

private static int channel_config = AudioFormat.CHANNEL_IN_MONO; 
private static int format = AudioFormat.ENCODING_PCM_16BIT; 
private static int Fs = 16000; 
private static int minBufferSize; 
private boolean isRecording; 
private boolean isProcessing; 
private boolean isNewAudioFragment; 

private final static int bytesPerSample = 2; // As it is 16bit PCM 
private final double amplification = 1.0; // choose a number as you like 
private static int frameLength = 512; // number of samples per frame => 32[ms] @Fs = 16[KHz] 
private static int windowLength = 16; // number of frames per window => 512[ms] @Fs = 16[KHz] 
private static int maxBufferedWindows = 8; // number of buffered windows => 4096 [ms] @Fs = 16[KHz] 

private static int bufferSize = frameLength*bytesPerSample; 
private static double[] hannWindow = new double[frameLength*bytesPerSample]; 

private Queue<byte[]> queue = new LinkedList<byte[]>(); 
private Semaphore semaphoreProcess = new Semaphore(0, true); 

private RecordSignal recordSignalThread; 
private ProcessSignal processSignalThread; 

public static class RecorderSingleton { 
    public static RecorderSingleton instance = new RecorderSingleton(); 
    private AudioRecord recordInstance = null; 

    private RecorderSingleton() { 
     minBufferSize = AudioRecord.getMinBufferSize(Fs, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT); 
     while(minBufferSize>bufferSize) { 
      bufferSize = bufferSize*2; 
     } 
    } 

    public boolean init() { 
     recordInstance = new AudioRecord(MediaRecorder.AudioSource.MIC, Fs, channel_config, format, bufferSize); 
     if (recordInstance.getState() != AudioRecord.STATE_INITIALIZED) { 
      Log.d("audiotestActivity", "Fail to initialize AudioRecord object"); 
      Log.d("audiotestActivity", "AudioRecord.getState()=" + recordInstance.getState()); 
     } 
     if (recordInstance.getState() == AudioRecord.STATE_UNINITIALIZED) { 
      return false; 
     } 
     return true; 
    } 

    public int getBufferSize() {return bufferSize;} 

    public boolean start() { 
     if (recordInstance != null && recordInstance.getState() != AudioRecord.STATE_UNINITIALIZED) { 
      if (recordInstance.getRecordingState() != AudioRecord.RECORDSTATE_STOPPED) { 
       recordInstance.stop(); 
      } 
      recordInstance.release(); 
     } 
     if (!init()) { 
      return false; 
     } 
     recordInstance.startRecording(); 
     return true; 
    } 
    public int read(byte[] audioBuffer) { 
     if (recordInstance == null) { 
      return AudioRecord.ERROR_INVALID_OPERATION; 
     } 
     int ret = recordInstance.read(audioBuffer, 0, bufferSize); 
     return ret; 
    } 
    public void stop() { 
     if (recordInstance == null) { 
      return; 
     } 
     if(recordInstance.getState()==AudioRecord.STATE_UNINITIALIZED) { 
      Log.d("AudioTest", "instance uninitialized"); 
      return; 
     } 
     if(recordInstance.getState()==AudioRecord.STATE_INITIALIZED) { 
      recordInstance.stop(); 
      recordInstance.release(); 
     } 
    } 
} 

public class RecordSignal implements Runnable { 
    private boolean cancelled = false; 
    public void run() { 
     Looper.prepare(); 
     // We're important... 
     // android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO); 
     int bufferRead = 0; 
     byte[] inAudioBuffer; 
     if (!RecorderSingleton.instance.start()) { 
      return; 
     } 
     try { 
      Log.d("audiotestActivity", "Recorder Started"); 
      while(isRecording) { 
        inAudioBuffer = new byte[bufferSize]; 
       bufferRead = RecorderSingleton.instance.read(inAudioBuffer); 
       if (bufferRead == AudioRecord.ERROR_INVALID_OPERATION) { 
        throw new IllegalStateException("read() returned AudioRecord.ERROR_INVALID_OPERATION"); 
       } else if (bufferRead == AudioRecord.ERROR_BAD_VALUE) { 
        throw new IllegalStateException("read() returned AudioRecord.ERROR_BAD_VALUE"); 
       } 
       queue.add(inAudioBuffer); 
       semaphoreProcess.release(); 
      } 
     } 
     finally { 
      // Close resources... 
      stop(); 
     } 
     Looper.loop(); 
    } 
    public void stop() { 
     RecorderSingleton.instance.stop(); 
    } 
    public void cancel() { 
     setCancelled(true); 
    } 
    public boolean isCancelled() { 
     return cancelled; 
    } 
    public void setCancelled(boolean cancelled) { 
     this.cancelled = cancelled; 
    } 
} 

public class ProcessSignal implements Runnable { 
    public void run() { 
     Looper.prepare(); 
     // android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DEFAULT); 
     while(isProcessing) { 
      try { 
       semaphoreProcess.acquire(); 
       byte[] outAudioBuffer = queue.poll(); // poll() returns null if empty, unlike element() 
       if(outAudioBuffer != null) { 
        // do something, process your samples 
       } 
      } 
      catch (InterruptedException e) { 
       e.printStackTrace(); 
      } 
     } 
     Looper.loop(); 
    } 
} 
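The essential handoff between the two threads above can be sketched in plain Java with no Android dependencies. `runHandoff` is a hypothetical condensation of RecordSignal/ProcessSignal; note that `LinkedList` is not thread-safe, so this sketch synchronizes around queue access (the Android code above does not, which is worth tightening up).

```java
import java.util.LinkedList;
import java.util.Queue;
import java.util.concurrent.Semaphore;

// Sketch of the queue + semaphore handoff used by RecordSignal and
// ProcessSignal: the producer enqueues a chunk and releases the semaphore;
// the consumer acquires before polling, so it never spins or misses data.
public class HandoffDemo {
    static int runHandoff(int n) throws InterruptedException {
        Queue<byte[]> queue = new LinkedList<>();
        Semaphore ready = new Semaphore(0, true);

        Thread producer = new Thread(() -> {
            for (int i = 0; i < n; i++) {
                synchronized (queue) { queue.add(new byte[]{(byte) i}); }
                ready.release();          // signal: one chunk is available
            }
        });
        producer.start();

        int consumed = 0;
        while (consumed < n) {
            ready.acquire();              // block until a chunk exists
            synchronized (queue) { queue.poll(); }
            consumed++;
        }
        producer.join();
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runHandoff(3)); // prints "3"
    }
}
```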

Start and stop it simply with:

public void startAudioTest() { 
    if(recordSignalThread!=null) { 
     recordSignalThread.stop(); 
     recordSignalThread.cancel(); 
     recordSignalThread = null; 
    } 
    if(processSignalThread!=null) { 
     processSignalThread = null; 
    } 
    recordSignalThread = new RecordSignal(); 
    processSignalThread = new ProcessSignal(); 
    isRecording = true; // set the flags before starting the threads so they don't exit immediately 
    isProcessing = true; 
    new Thread(recordSignalThread).start(); 
    new Thread(processSignalThread).start(); 
} 

public void stopAudioTest() { 
    isRecording = false; 
    isProcessing = false; 
    if(processSignalThread!=null) { 
     processSignalThread = null; 
    } 
    if(recordSignalThread!=null) { 
     recordSignalThread.cancel(); 
     recordSignalThread = null; 
    } 
} 
0

As suggested in the accepted answer, do not try passing half of the offset to the read function: the offset is an integer and may be an odd number, which leads to poor audio quality and is incompatible with Android versions other than 5.0.1 and 5.0.2. I used the following workaround, which works on all Android versions. I changed:

short[] buffer = new short[frame_size*(frame_rate)]; 
num = record.read(buffer, offset, frame_size); 

to: 

short[] buffer = new short[frame_size*(frame_rate)]; 
short[] buffer_bugfix = new short[frame_size]; 
num = record.read(buffer_bugfix, 0, frame_size); 
System.arraycopy(buffer_bugfix, 0, buffer, offset, frame_size); 

In words: instead of having the read function copy the data into the large buffer at the offset position, I have the read function copy the data into a smaller buffer, and then I manually insert that data at the offset position of the large buffer.
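The pattern can be demonstrated in plain Java; `fakeRead` below is a hypothetical stand-in for `AudioRecord.read()` so the copy step can be run anywhere.

```java
// Sketch of the workaround above: read into a small buffer at offset 0, then
// System.arraycopy into the large buffer at the intended offset, so the
// broken offset handling inside AudioRecord.read() is never exercised.
public class CopyWorkaround {
    // Stand-in for record.read(buf, 0, count): fills buf with samples 1..count.
    static int fakeRead(short[] buf, int offsetInBuf, int count) {
        for (int i = 0; i < count; i++) buf[offsetInBuf + i] = (short) (i + 1);
        return count;
    }

    static short[] readAtOffset(short[] buffer, int offset, int frameSize) {
        short[] bugfix = new short[frameSize];
        int num = fakeRead(bugfix, 0, frameSize);      // always offset 0
        System.arraycopy(bugfix, 0, buffer, offset, num); // place manually
        return buffer;
    }

    public static void main(String[] args) {
        short[] buffer = readAtOffset(new short[16], 8, 4);
        System.out.println(buffer[8] + " " + buffer[9]); // prints "1 2"
    }
}
```

The extra `arraycopy` is cheap relative to the audio read itself, which is why this is a reasonable always-on fix rather than something to gate on the OS version.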