good stuff. I had assumed syncContext would be null on web servers. It was just added as a convenient way to make WinForms and WPF work.
Mark
Hi,
I think the NAudio library does not support 32-bit PCM audio, so here is my modest contribution: a Pcm32BitToSampleProvider class and a small change in the SampleProviderConverters class.
Sorry for the quality of my English ;-)
/* PCM format addition
 *
 * addition in SampleProviderConverters.cs
 *
 * in ConvertWaveProviderIntoSampleProvider
 *
 * else if (waveProvider.WaveFormat.BitsPerSample == 32)
 * {
 *     sampleProvider = new Pcm32BitToSampleProvider(waveProvider);
 * }
 */
namespace NAudio.Wave.SampleProviders
{
    /// <summary>
    /// Converts an IWaveProvider containing 32 bit PCM to an
    /// ISampleProvider
    /// </summary>
    public class Pcm32BitToSampleProvider : SampleProviderConverterBase
    {
        /// <summary>
        /// Initialises a new instance of Pcm32BitToSampleProvider
        /// </summary>
        /// <param name="source">Source Wave Provider</param>
        public Pcm32BitToSampleProvider(IWaveProvider source)
            : base(source)
        {
        }

        /// <summary>
        /// Reads floating point samples from this sample provider
        /// </summary>
        /// <param name="buffer">sample buffer</param>
        /// <param name="offset">offset within sample buffer to write to</param>
        /// <param name="count">number of samples required</param>
        /// <returns>number of samples provided</returns>
        public override int Read(float[] buffer, int offset, int count)
        {
            int sourceBytesRequired = count * 4;
            EnsureSourceBuffer(sourceBytesRequired);
            int bytesRead = source.Read(sourceBuffer, 0, sourceBytesRequired);
            int outIndex = offset;
            for (int n = 0; n < bytesRead; n += 4)
            {
                buffer[outIndex++] = (((sbyte)sourceBuffer[n + 3] << 24 |
                                       sourceBuffer[n + 2] << 16) |
                                      (sourceBuffer[n + 1] << 8) |
                                      sourceBuffer[n]) / 2147483648f;
            }
            return bytesRead / 4;
        }
    }
}
And modify SampleProviderConverters:
...
}
else if (waveProvider.WaveFormat.BitsPerSample == 32)
{
    sampleProvider = new Pcm32BitToSampleProvider(waveProvider);
}
else
{
    throw new InvalidOperationException("Unsupported operation");
}
...
Best Regards, Manu.N
Hi Mark,
I just wanted to say thanks again for sharing NAudio. I'm new to programming, doing it as a hobby and to learn some stuff. I really don't know that much about C# yet, but I do see that it's similar to Visual Basic.
From studying the demo you have, and looking at the tutorials that giawa has, I've managed to put together a test using VB.NET.
I can open and play a wav file, and select my ASIO driver. I'm using a Lynx Studio AES 16 sound card (buffer is set to 512).
When I hit stop, it stutters for a few ms. I know I'm missing something, but I can't figure out what. Am I not disposing the asio driver correctly?
Any help would greatly be appreciated.
Many Thanks, Wyatt
Here's my code:
Imports NAudio.Wave

Public Class Form1
    Private wave As NAudio.Wave.WaveFileReader = Nothing
    Private output As NAudio.Wave.AsioOut = Nothing

    Public Sub New()
        InitializeComponent()
        InitialiseAsioControls()
    End Sub

    Public ReadOnly Property SelectedDeviceName() As String
        Get
            Return CStr(comboBoxAsioDriver.SelectedItem)
        End Get
    End Property

    Private Sub InitialiseAsioControls()
        ' Just fill the comboBox AsioDriver with available driver names
        Dim asioDriverNames = AsioOut.GetDriverNames()
        For Each driverName As String In asioDriverNames
            comboBoxAsioDriver.Items.Add(driverName)
        Next driverName
        comboBoxAsioDriver.SelectedIndex = 0
    End Sub

    Private Sub btnOpen_Click(sender As System.Object, e As System.EventArgs) Handles btnOpen.Click
        Dim wavFile As String
        OpenWavFile.InitialDirectory = ""
        wavFile = OpenWavFile.ShowDialog()
        wavFile = OpenWavFile.FileName
        Label1.Text = wavFile
    End Sub

    Private Sub btnPlay_Click(sender As System.Object, e As System.EventArgs) Handles btnPlay.Click
        If Label1.Text = Nothing Then
            Dim wavFile As String
            OpenWavFile.InitialDirectory = ""
            wavFile = OpenWavFile.ShowDialog()
            wavFile = OpenWavFile.FileName
            Label1.Text = wavFile
        Else
            wave = New NAudio.Wave.WaveFileReader(Label1.Text)
            output = New AsioOut(comboBoxAsioDriver.Text)
            output.Init(New NAudio.Wave.WaveChannel32(wave))
            output.Play()
            btnPlay.Enabled = False
        End If
    End Sub

    Private Sub DisposeWave()
        If output IsNot Nothing Then
            If output.PlaybackState = NAudio.Wave.PlaybackState.Playing Then
                output.Stop()
            End If
            output.Dispose()
            output = Nothing
        End If
        If wave IsNot Nothing Then
            wave.Dispose()
            wave = Nothing
        End If
    End Sub

    Private Sub btnStop_Click(sender As System.Object, e As System.EventArgs) Handles btnStop.Click
        DisposeWave()
        btnPlay.Enabled = True
    End Sub
End Class
I think the NAudio library does not support the 64-bit IEEE float audio format, so here is my modest contribution: a WaveToSampleProvider64 class and a small change in the SampleProviderConverters class.
using System;

namespace NAudio.Wave.SampleProviders
{
    /// <summary>
    /// Helper class turning an already 64 bit floating point IWaveProvider
    /// into an ISampleProvider - hopefully not needed for most applications
    /// </summary>
    public class WaveToSampleProvider64 : SampleProviderConverterBase
    {
        /// <summary>
        /// Initializes a new instance of the WaveToSampleProvider class
        /// </summary>
        /// <param name="source">Source wave provider, must be IEEE float</param>
        public WaveToSampleProvider64(IWaveProvider source)
            : base(source)
        {
            if (source.WaveFormat.Encoding != WaveFormatEncoding.IeeeFloat)
            {
                throw new ArgumentException("Must be already floating point");
            }
        }

        /// <summary>
        /// Reads from this provider
        /// </summary>
        public override int Read(float[] buffer, int offset, int count)
        {
            int bytesNeeded = count * 8;
            EnsureSourceBuffer(bytesNeeded);
            int bytesRead = source.Read(sourceBuffer, 0, bytesNeeded);
            int samplesRead = bytesRead / 8;
            int outputIndex = offset;
            for (int n = 0; n < bytesRead; n += 8)
            {
                long sample64 = BitConverter.ToInt64(sourceBuffer, n);
                buffer[outputIndex++] = (float)BitConverter.Int64BitsToDouble(sample64);
            }
            return samplesRead;
        }
    }
}
And modify the SampleProviderConverters class:
else if (waveProvider.WaveFormat.Encoding == WaveFormatEncoding.IeeeFloat)
{
    if (waveProvider.WaveFormat.BitsPerSample == 64)
        sampleProvider = new WaveToSampleProvider64(waveProvider);
    else
        sampleProvider = new WaveToSampleProvider(waveProvider);
}
else
{
    throw new ArgumentException("Unsupported source encoding");
}
Best Regards, Manu.N
Don't Dispose straight away after stopping. Call output.Stop(), and then in the handler for PlaybackStopped you can Dispose the device. Also, please make sure you are using the very latest NAudio (there is a pre-release build up on NuGet).
Thanks for the quick reply. Sorry for being a noob.
Could you give me an example in the test code I have above? Or do you mean that where I have btnStop, I should
add output.Stop() first? I tried this, and it still stutters. I'm still not understanding something.
Private Sub DisposeWave()
    If output IsNot Nothing Then
        If output.PlaybackState = NAudio.Wave.PlaybackState.Playing Then
            output.Stop()
        End If
        output.Dispose()
        output = Nothing
    End If
    If wave IsNot Nothing Then
        wave.Dispose()
        wave = Nothing
    End If
End Sub

Private Sub btnStop_Click(sender As System.Object, e As System.EventArgs) Handles btnStop.Click
    output.Stop()
    DisposeWave()
    btnPlay.Enabled = True
End Sub
get rid of the call to output.Dispose(). The ASIO driver is not ready for it yet.
Instead, subscribe to output.PlaybackStopped event. Then in there, call output.Dispose().
Alternatively, only Dispose the device when you press play again or close your form
Thanks again Mark.
In the code above, I got rid of output.Dispose()
I still don't understand "Instead, subscribe to output.PlaybackStopped event."
In addition to what Mark said, I think you will need to keep a reference to the WaveChannel32 object, so you can set .PadWithZeroes to False, which enables the PlaybackStopped event to fire at end of file.
Something like this:
Private WithEvents output As NAudio.Wave.AsioOut = Nothing
Private mainOutputStream As WaveChannel32

...
mainOutputStream = New WaveChannel32(wave)   ' tie the file reader to the output stream
mainOutputStream.PadWithZeroes = False       ' need this to detect eof
output.Init(mainOutputStream)                ' tie output stream to output device
...

Private Sub PlaybackStopped(sender As Object, e As System.EventArgs) Handles output.PlaybackStopped
    ' dispose
End Sub
TrueSpeechWaveFormat tspf = new TrueSpeechWaveFormat();
using (MemoryStream ms = new MemoryStream(Encoding.Default.GetBytes(dataToPlay)))
{
    RawSourceWaveStream rws = new RawSourceWaveStream(ms, tspf);
    WaveFormatConversionStream pcmStream =
        (WaveFormatConversionStream)WaveFormatConversionStream.CreatePcmStream(rws);
    wo.Init(pcmStream);
    wo.Play();
}
This is inside a function which gets called every 85 ms, so it's consuming too much memory. What's the neat way of doing this?
can you provide a simple source?
Use a single BufferedWaveProvider, and whenever you receive data, decompress it and put it into the BufferedWaveProvider. Then you can just play from the BufferedWaveProvider. For a full code example, look at the source code for the network chat application in NAudioDemo (see the Source Code tab above).
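Something like this, as a rough untested sketch (the class name and the Decompress helper are made up for illustration, and the WaveFormat you pass in needs to match whatever PCM your decode step actually produces):

using NAudio.Wave;

class StreamingPlayback
{
    private readonly BufferedWaveProvider buffer;
    private readonly WaveOut waveOut;

    public StreamingPlayback(WaveFormat decodedFormat)
    {
        // one buffer and one output device for the whole session - not created per packet
        buffer = new BufferedWaveProvider(decodedFormat);
        buffer.DiscardOnBufferOverflow = true; // drop data rather than throw if playback falls behind
        waveOut = new WaveOut();
        waveOut.Init(buffer);
        waveOut.Play();
    }

    // call this every time a compressed packet arrives (e.g. every 85 ms)
    public void OnPacketReceived(byte[] compressedPacket)
    {
        byte[] pcm = Decompress(compressedPacket); // hypothetical helper: your TrueSpeech-to-PCM step
        buffer.AddSamples(pcm, 0, pcm.Length);     // just queue it; WaveOut keeps pulling from the buffer
    }

    private byte[] Decompress(byte[] compressed)
    {
        // placeholder - in your code this is the conversion you are already doing per packet
        throw new System.NotImplementedException();
    }
}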
Hi, I've been using NAudio for over a month now and I get a ton of use out of it for a project I'm doing. However, I was wondering if there is a way to take multiple wav files and overlap them?
i.e. place them on top of each other.
I am using Visual C# Express 2010
Thank you
For some reason this uploaded three times. Very sorry for the accidental spamming, I can't find where to delete them :(
no problem, I can delete them. The mixer streams that NAudio provides can do this. WaveMixerStream32, or MixingSampleProvider. It is always best to go to 32 bit floating point first, mix them together, reduce the volume a bit in case of clipping, then you can go back down to 16 bit PCM, before writing to a WAV file with WaveFileWriter.
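Something like this, as a rough untested sketch (the file names and the MixExample class are just placeholders; it assumes both inputs are ordinary PCM WAV files with the same sample rate and channel count):

using NAudio.Wave;

class MixExample
{
    public static WaveStream CreateMix()
    {
        var reader1 = new WaveFileReader("first.wav");
        var reader2 = new WaveFileReader("second.wav");

        // WaveChannel32 takes each input up to 32 bit floating point, and Volume
        // lets you pull each one down a bit to guard against clipping;
        // PadWithZeroes = false so each input actually ends instead of padding with silence forever
        var channel1 = new WaveChannel32(reader1) { Volume = 0.5f, PadWithZeroes = false };
        var channel2 = new WaveChannel32(reader2) { Volume = 0.5f, PadWithZeroes = false };

        var mixer = new WaveMixerStream32();
        mixer.AddInputStream(channel1);
        mixer.AddInputStream(channel2);
        return mixer; // a 32 bit floating point stream with both files overlapped
    }
}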
Mark
OK, so I make a WaveMixerStream32 and add the streams to it, but what do I do with it from there? What is the method that mixes and writes them?
Thanks again for your help Mark, and eejake52.
Looks like I need to start over, or something. I never did get it to work right. No matter what I've tried, it still stutters when I hit the stop button.
I'll still keep trying.
Thanks, Wyatt
No luck with the source code. The buffer is getting full in 1 second. I tried the boolean option to discard data when the buffer is full. The stream is sending data at an interval of 100 ms. It's playing, I think, but no sound is coming out. CPU usage is at 50%.
are you sure you're reading from the BufferedWaveProvider?
WaveFileWriter.CreateWaveFile will read from a WaveStream and make a file. Make sure WaveMixerStream32.PadWithZeroes is set to false or you will fill up your hard disk
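Building on the mixing sketch above, the write step might look roughly like this (untested; "mixed.wav" is a placeholder, and Wave32To16Stream is one way to get back down to 16 bit PCM before writing):

using (var mixer = MixExample.CreateMix())           // the 32 bit float mixer from the earlier sketch
using (var sixteenBit = new Wave32To16Stream(mixer)) // back down to 16 bit PCM
{
    // reads the whole mixed stream and writes it out as a WAV file
    WaveFileWriter.CreateWaveFile("mixed.wav", sixteenBit);
}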