NAudio Wasapi recording and conversion

I'm using NAudio and trying to record what is currently being played back with WasapiLoopbackCapture. My problem is that I need the recorded data as PCM, 16-bit, 44100 Hz, mono. For that I built the following:

using System; 
using System.Diagnostics; 

using NAudio.Wave; 
using NAudio.CoreAudioApi; 

namespace soundtest 
{ 
    class Program { 

     static void Main(string[] args) { 
      try { 
       var deviceToRecord = (new MMDeviceEnumerator().EnumerateAudioEndPoints(DataFlow.All, DeviceState.Active))[0]; 

       var recorder = new CustomWasapiLoopbackCapture(deviceToRecord, false, 1000/5); 
       recorder.ShareMode = AudioClientShareMode.Shared; 
       recorder.DataAvailable += recorderDataAvailable; 

       // conversion chain: device format (IEEE float) -> 16-bit PCM -> mono 
       var inprov = new WaveInProvider(recorder); 
       var fto16prov = new WaveFloatTo16Provider(inprov); 
       var stomprov = new StereoToMonoProvider16(fto16prov); 

       Console.WriteLine("Press something to stop recording."); 
       recorder.StartRecording(); 
       Console.ReadKey(); 
       recorder.StopRecording(); 

      } catch (Exception e) { 
       Console.WriteLine("!!! EXCEPTION !!!" + 
        "\nMessage:\n " + e.Message + 
        "\nSource:\n " + e.Source + 
        "\nStack:\n" + e.StackTrace); 
      } 

      Console.WriteLine("Press something to close."); 
      Console.ReadKey(); 
     } 

     static void recorderDataAvailable(object sender, WaveInEventArgs args) { 
      // how do I access PCM 16bit here? 
      // It's not args.Buffer, or am I wrong? 
      // additional calculation is done here with the PCM data 
     } 
    } 


    class CustomWasapiLoopbackCapture : WasapiCapture 
    { 
     public CustomWasapiLoopbackCapture() 
      : this(GetDefaultLoopbackCaptureDevice()){ } 
     public CustomWasapiLoopbackCapture(MMDevice captureDevice) 
      : this(captureDevice, false){ } 
     public CustomWasapiLoopbackCapture(MMDevice captureDevice, bool useEventSync) 
      : this(captureDevice, useEventSync, 100){ } 
     public CustomWasapiLoopbackCapture(MMDevice captureDevice, bool useEventSync, int audioBufferMillisecondsLength) 
      : base(captureDevice, useEventSync, audioBufferMillisecondsLength){ } 

     public static MMDevice GetDefaultLoopbackCaptureDevice() { 
      MMDeviceEnumerator devices = new MMDeviceEnumerator(); 
      return devices.GetDefaultAudioEndpoint(DataFlow.Render, Role.Multimedia); 
     } 

     public override WaveFormat WaveFormat 
     { 
      get { return base.WaveFormat; } 
      set { throw new InvalidOperationException("WaveFormat cannot be set for WASAPI Loopback Capture"); } 
     } 

     protected override AudioClientStreamFlags GetAudioClientStreamFlags() { 
      return AudioClientStreamFlags.Loopback; 
     } 
    } 
} 

How can I access the converted recording? I thought that by adding these providers I would be able to get at the converted data for my further calculations. My assumption that args.Buffer does not contain the desired PCM 16-bit 44100 Hz mono data comes from the implausible results of the additional processing I do in the recorderDataAvailable method. I tested that processing itself with a simple WaveInEvent on another input of my mixer board, into which I fed the playback again.
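To make the intent clearer: instead of using args.Buffer directly, I was hoping to read the converted data from the end of the chain, roughly like this (untested, and the read size is an arbitrary guess on my part):

       var buffer = new byte[stomprov.WaveFormat.AverageBytesPerSecond / 5]; // ~200 ms worth, arbitrary 
       int bytesRead = stomprov.Read(buffer, 0, buffer.Length); 
       // buffer should then contain 16-bit mono PCM, if this is the right way to drive the providers 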

Answer


WASAPI always records audio as IEEE floating-point samples, so in the buffer you get in the callback every 4 bytes make up one float. The easiest way to access individual samples is BitConverter.ToSingle, which gives you a value in the range +/- 1.0. Multiply that by 32767 and cast it to an Int16 to turn it into a 16-bit sample value.
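In the DataAvailable handler that could look roughly like this (a sketch; it only changes the bit depth, the channel count and sample rate stay whatever the device delivers):

      static void recorderDataAvailable(object sender, WaveInEventArgs args) { 
       // args.Buffer contains IEEE float samples: 4 bytes per sample 
       int sampleCount = args.BytesRecorded / 4; 
       var pcm16 = new short[sampleCount]; 
       for (int i = 0; i < sampleCount; i++) { 
        float sample = BitConverter.ToSingle(args.Buffer, i * 4); // roughly -1.0 .. +1.0 
        pcm16[i] = (short)(sample * 32767); 
       } 
       // pcm16 now holds 16-bit sample values for further processing 
      } 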


Could you elaborate on that? I'm trying to copy the recorded buffer into a System.IO.Stream so that I can pass it to SpeechRecognitionEngine.SetInputToAudioStream(). I've tried resampling with AcmStream, but I think that may not even be necessary. The SRE only accepts 8- or 16-bit samples. Basically I'm trying to do [this](https://stackoverflow.com/questions/1682902/streaming-input-to-system-speech-recognition-speechrecognitionengine), but with WasapiLoopbackCapture(). – javon27