Channel: NAudio

New Post: How to detect silence?

Thank you for the reply.
Are there any libraries for filtering out noise?

New Post: How to detect silence?

I don't know of any .NET-based ones.

Perhaps you can achieve better results by reducing the overall volume by ~10-20% (which would discard some audio information as well). Either way, it isn't possible without some quality loss.
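As a rough illustration of that volume-reduction suggestion, here is a minimal C# sketch using NAudio's VolumeSampleProvider (the file names and the 0.85 factor are placeholders, not from the original post):

    using System;
    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // Reduce overall volume by ~15% and write the result to a new file.
    using (var reader = new AudioFileReader("noisy.wav"))
    {
        var quieter = new VolumeSampleProvider(reader) { Volume = 0.85f };
        WaveFileWriter.CreateWaveFile16("quieter.wav", quieter);
    }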

New Post: Splicing part of a wav to a new file

I have a source wav file that is a combination of multiple recordings. I know the start and end time of each of these recordings (as TimeSpans). How would I open the original wav file and then save a range of it to a new file, so that I can split it back up into multiple files?

New Post: Reading chunks of wave file data

I'm reading wave files into memory, and they have a special "korg" chunk that contains the name of the sample and other information. Can NAudio help me do this? I saw there's a WaveFileChunkReader class, but it's private. Thanks!

New Post: Reading chunks of wave file data

Never mind, I found it: WaveFileChunkReader is wrapped by WaveFileReader.
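For anyone who lands here later, a minimal sketch of reading such a chunk through WaveFileReader's ExtraChunks list (the "korg" id comes from the question; the file name and the parsing are placeholders):

    using NAudio.Wave;

    // Scan the non-standard RIFF chunks for the vendor-specific "korg" chunk.
    using (var reader = new WaveFileReader("sample.wav"))
    {
        foreach (var chunk in reader.ExtraChunks)
        {
            if (chunk.IdentifierAsString == "korg")
            {
                byte[] data = reader.GetChunkData(chunk);
                // parse the sample name and other fields from data here
            }
        }
    }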

New Post: Splicing part of a wav to a new file

These classes should do it:

WaveFileReader -> ToSampleProvider -> OffsetSampleProvider (specify offset timespans here)
WaveFileWriter
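A minimal sketch of that pipeline (file names and times are placeholders; Take is the recording's duration, i.e. end minus start):

    using System;
    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // Copy one recording out of the combined file into its own wav.
    using (var reader = new WaveFileReader("combined.wav"))
    {
        var trimmed = new OffsetSampleProvider(reader.ToSampleProvider())
        {
            SkipOver = TimeSpan.FromSeconds(10), // start time of the recording
            Take = TimeSpan.FromSeconds(30)      // its duration (end - start)
        };
        WaveFileWriter.CreateWaveFile16("part1.wav", trimmed);
    }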

New Post: Question about Universal Apps

Hi friends,


I could not find a more appropriate forum, which is why I am asking here. I use the library in WPF without issues, but after porting to Universal Apps I get stutters in the audio. I've been fiddling with some of the parameters, but I don't really know much about how this works.

Hope you can help. Thanks a lot!
    public void Play(string uri)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
        request.BeginGetResponse(new AsyncCallback(GetResponseCallback), request);
    }

    private void GetResponseCallback(IAsyncResult asynchronousResult)
    {
        HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
        HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
        Stream responseStream = response.GetResponseStream();
        StreamReader streamRead = new StreamReader(responseStream);

        var waveFormat = new WaveFormat(8000, 16, 1);
        var bufferedWaveProvider = new BufferedWaveProvider(waveFormat);
        bufferedWaveProvider.BufferDuration = TimeSpan.FromMilliseconds(200);

        var waveOut = new WasapiOutRT(AudioClientShareMode.Shared, 100);
        waveOut.Init(() => bufferedWaveProvider);
        waveOut.Play();

        byte[] buffer = new byte[800];
        while (true)
        {
            var sizeRead = streamRead.BaseStream.Read(buffer, 0, buffer.Length);
            if (sizeRead > 0 && bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes > sizeRead)
            {
                bufferedWaveProvider.AddSamples(buffer, 0, sizeRead);
            }
        }
    }

New Post: Question about Universal Apps

Just for comparison, here's the code I use in WPF.
    var webRequest = WebRequest.Create(uri);

    using (var webResponse = (HttpWebResponse)webRequest.GetResponse())
    {
        using (var responseStream = new StreamReader(webResponse.GetResponseStream()))
        {
            var waveFormat = new WaveFormat(8000, 16, 1);

            var bufferedWaveProvider = new BufferedWaveProvider(waveFormat);
            bufferedWaveProvider.BufferDuration = TimeSpan.FromMilliseconds(200);

            var waveOut = new WaveOutEvent();
            waveOut.DesiredLatency = 100;
            waveOut.Init(bufferedWaveProvider);
            waveOut.Play();

            byte[] buffer = new byte[800];
            while (true)
            {
                if (bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes > buffer.Length)
                {
                    if (responseStream.Peek() > -1 && responseStream.BaseStream.Read(buffer, 0, buffer.Length) > 0)
                    {
                        bufferedWaveProvider.AddSamples(buffer, 0, buffer.Length);
                        await Task.Delay(1, cancellationToken.Token);
                    }
                }
            }
        }
    }

New Post: Question about Universal Apps

Solved!

After much fiddling, I decided to try playing a wave file (16-bit 8000 Hz PCM stream) from disk. The file played correctly but at twice the speed, so I changed the WaveFormat on the WaveOut to 4000 Hz and it worked!

Here's the code.
    HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
    HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult);
    Stream responseStream = response.GetResponseStream();
    StreamReader streamRead = new StreamReader(responseStream);

    var waveFormat = new WaveFormat(4000, 16, 1);
    var bufferedWaveProvider = new BufferedWaveProvider(waveFormat);
    bufferedWaveProvider.BufferDuration = TimeSpan.FromMilliseconds(1000);

    var waveOut = new WasapiOutRT(AudioClientShareMode.Shared, 100);
    waveOut.Init(() => bufferedWaveProvider);
    waveOut.Play();

    byte[] buffer = new byte[800];
    while (true)
    {
        if (bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes > buffer.Length)
        {
            var bufferSize = streamRead.BaseStream.Read(buffer, 0, buffer.Length);
            if (bufferSize > 0 && bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes > bufferSize)
                bufferedWaveProvider.AddSamples(buffer, 0, bufferSize);
        }
    }
Please note that this produced correct audio both for an 8000 Hz audio stream from disk and for an 8000 Hz network audio stream. If you have suggestions for improving this code, by all means. Thanks!

New Post: Question about Universal Apps

More likely the correct WaveFormat should have been 8000,8,1 or maybe 8000,16,2, but I'm glad to hear you got something working.

New Post: Question about Universal Apps

Hi Mark,

I have just tried both and they don't work. I've also tried them in WPF, with similar results. It strikes me as odd that the code is now nearly identical with regard to NAudio, and yet I could only get it to work using 4 kHz.

Since I am a simple library consumer, I don't really care though. Would you like me to prepare a test case using a raw WAV and two example projects, one for WPF and one for Universal Apps?

Created Unassigned: DirectSoundOut fails if WaveFormat is WaveFormatExtraData [16487]

If the WaveFormat of the wave provider passed to Init is of type WaveFormatExtraData, the call to GCHandle.Alloc within InitializeDirectSound fails with an ArgumentException complaining that the “Object contains non-primitive or non-blittable data.”

This is due to the extraData byte-array field: although it has a fixed size, GCHandle.Alloc is still unable to pin the structure, because the array type itself is non-primitive and so cannot be pinned.

public TestPlay(Stream input)
{
    var waveStream = new WaveFileReader(input);
    var player = new DirectSoundOut(DirectSoundOut.DSDEVID_DefaultPlayback);
    player.Init(waveStream);
    player.Play();
    // Waiting for playback to stop removed for brevity.
}

Commented Unassigned: DirectSoundOut fails if WaveFormat is WaveFormatExtraData [16487]

Comments: I think the simple solution here is to replace the GCHandle.Alloc with a pair of Marshal.AllocHGlobal and Marshal.StructureToPtr calls.

    bufferDesc2.lpwfxFormat = Marshal.AllocHGlobal(Marshal.SizeOf(waveFormat));
    try
    {
        Marshal.StructureToPtr(waveFormat, bufferDesc2.lpwfxFormat, false);
        bufferDesc2.guidAlgo = Guid.Empty;

        // Create SecondaryBuffer
        directSound.CreateSoundBuffer(bufferDesc2, out soundBufferObj, IntPtr.Zero);
        secondaryBuffer = (IDirectSoundBuffer)soundBufferObj;
    }
    finally
    {
        Marshal.DestroyStructure(bufferDesc2.lpwfxFormat, waveFormat.GetType());
        Marshal.FreeHGlobal(bufferDesc2.lpwfxFormat);
    }

I found a similar problem in WmaWriter, but since I am not using it I have not tested the problem or the fix. I think something like this would work:

    mt.pbFormat = Marshal.AllocHGlobal(Marshal.SizeOf(inputFormat));
    try
    {
        Marshal.StructureToPtr(inputFormat, mt.pbFormat, false);
        InpProps.SetMediaType(ref mt);
    }
    finally
    {
        Marshal.DestroyStructure(mt.pbFormat, inputFormat.GetType());
        Marshal.FreeHGlobal(mt.pbFormat);
    }

New Post: Stereo Pairs

I'm using ASIO out, but all of my outputs are showing as mono. Is there a way to have them listed and addressed as stereo pairs, like 1/2?

Thanks.

New Post: Stereo Pairs

With ASIO you open all the inputs and outputs on the card and write to them yourself. NAudio has MultiplexingWaveProvider (and MultiplexingSampleProvider) classes that can make life a bit easier for you.
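A minimal sketch of routing with MultiplexingSampleProvider (the driver name, file name, and channel numbers are illustrative; this assumes an 8-output card and a stereo source):

    using NAudio.Wave;
    using NAudio.Wave.SampleProviders;

    // Route a stereo file to physical outputs 3/4 of an 8-output ASIO device.
    var reader = new AudioFileReader("music.wav"); // stereo source
    var mux = new MultiplexingSampleProvider(new ISampleProvider[] { reader }, 8);
    mux.ConnectInputToOutput(0, 2); // source left  -> output 3
    mux.ConnectInputToOutput(1, 3); // source right -> output 4

    var asioOut = new AsioOut("MyAsioDriver");
    asioOut.Init(mux.ToWaveProvider());
    asioOut.Play();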

New Post: Stereo Pairs

Thanks Mark. Now I only need to figure out how to point my output selection (ComboOutput.SelectedIndex) at the target for playback. No matter what I do in that bottom code block (empty here), it still plays through the default audio device. Suffice to say, the C# examples are not that clear to me, a VB guy.

Forgetting the larger C# classes for now, I simply need to play a file (filename) out of device X and output X via ASIO in that one code block. I'd appreciate any example.
Private Sub ComboASIO_SelectedIndexChanged(sender As Object, e As EventArgs) Handles ComboASIO.SelectedIndexChanged
    ' Hold the current device index in case of failure.
    Dim CurrentIndex = ComboASIO.SelectedIndex

    Try
        Dim Asio = New AsioOut(ComboASIO.SelectedIndex)
        Dim outputs = Asio.DriverOutputChannelCount

        ' load the outputs from the selected device
        ComboASIOOutputs.Items.Clear()
        For output = 0 To outputs - 1
            ComboASIOOutputs.Items.Add(Asio.AsioOutputChannelName(output))
        Next

        If ComboASIOOutputs.Items.Count > 0 Then ComboASIOOutputs.SelectedIndex = 0

    Catch ex As Exception
        MsgBox(ComboASIO.Text & " Error: " & ex.Message, vbOKOnly + vbInformation, "Set ASIO Device")
        Exit Sub
    End Try
End Sub

Private Sub ComboASIOOutputs_SelectedIndexChanged(sender As Object, e As EventArgs) Handles ComboASIOOutputs.SelectedIndexChanged

    '**** need code here to make the selected output (or output pair) the target for playback.

End Sub

New Post: Stereo Pairs

I figured it out: ChannelOffset to hit the other pairs. Now I only need to set the volume before it hits the ASIO part, since (I guess) volume isn't supported in ASIO there; see the sketch after the code below.
    wave = New NAudio.Wave.WaveFileReader(Filename)

    Dim output = New AsioOut(ComboASIO.Text)
    output.ChannelOffset = ComboASIOOutputs.SelectedIndex
    output.Init(New NAudio.Wave.WaveChannel32(wave))
    output.Play()
And the mono/stereo thing is irrelevant; hitting the left hits both anyway, so it's a non-issue.
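On the volume point: WaveChannel32, which the snippet above already uses, exposes a Volume property that scales samples before they reach the ASIO driver. A C# sketch of the same setup (fileName, driverName, selectedOutputIndex, and the 0.5f factor are placeholders):

    // Attenuate before the ASIO output; ASIO itself provides no volume control here.
    var wave = new WaveFileReader(fileName);
    var channel = new WaveChannel32(wave) { Volume = 0.5f };

    var output = new AsioOut(driverName) { ChannelOffset = selectedOutputIndex };
    output.Init(channel);
    output.Play();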

Thanks a lot for your help Mark

New Post: An Event to Fire to Dispose?

My app plays really short audio samples, and I can't really use a "stop" button to dispose of the reader. My thought was to run a timer and poll a property to dispose once a file has finished playing, but there doesn't seem to be any event or property that triggers in that case with ASIO out.

Even after the file is done playing, the "Playing" property is still true. I expected it to be false, that playback would automatically "stop" at the end of the file, so that my timer could poll that property and automatically dispose of it.
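A sketch of that timer-polling idea (reader and asioOut are assumed to be fields set up elsewhere; whether AsioOut ever raises PlaybackStopped at end of stream is not confirmed here, so this just compares stream positions):

    // Timer tick handler: dispose once the whole file has been read.
    private void OnPlaybackTimerTick(object sender, EventArgs e)
    {
        if (reader != null && reader.Position >= reader.Length)
        {
            asioOut.Stop();
            asioOut.Dispose();
            reader.Dispose();
            reader = null;
        }
    }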

Any ideas? Thanks.

New Post: Clipping wrap-around

I'm having a major problem with NAudio. I am currently using it to create audio effects, and sometimes my effects cause the audio to jump up to very high levels, above an amplitude of 1. In these instances, I expect the audio to be automatically clipped to a value of 1 when it is saved to a WAV file, as this is what happens in other audio applications. However, in NAudio, any sample that is exactly 1 or greater than 1 ends up wrapped around to a value of -1. This is a very big problem. My current workaround is to have my application clip all samples at 0.99999 right before saving them. Perhaps I did something wrong, but I really think this is a critical issue.
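For reference, the described workaround looks something like this (the 0.99999f ceiling is the value from the post; where exactly to apply it depends on the effect chain):

    // Clamp a float sample just below 1.0 so the float-to-PCM
    // conversion on save cannot wrap around to -1.
    float ClampSample(float sample)
    {
        if (sample > 0.99999f) return 0.99999f;
        if (sample < -1.0f) return -1.0f;
        return sample;
    }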
Thanks for your time.

T

New Post: Error in playing compressed wave file using waveoutevent player.


Hello Sir,

I have developed a player using NAudio and it is working fine, but now I have a small problem: a compressed WAV file cannot be played through the WaveOutEvent player, as I am getting an error stating “NoDriver calling acmFormatSuggest”. Kindly let me know how to play compressed WAV files using the WaveOutEvent player.
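(Editorial note: the usual approach is to decompress to PCM first, e.g. with WaveFormatConversionStream.CreatePcmStream as sketched below. That call goes through the same ACM layer that produced this error, though, so it will only work once a codec for the file's compression format is installed; the file name is a placeholder.)

    using NAudio.Wave;

    // Decompress to PCM before handing the stream to WaveOutEvent.
    using (var reader = new WaveFileReader("compressed.wav"))
    using (var pcm = WaveFormatConversionStream.CreatePcmStream(reader))
    using (var waveOut = new WaveOutEvent())
    {
        waveOut.Init(pcm);
        waveOut.Play();
        while (waveOut.PlaybackState == PlaybackState.Playing)
            System.Threading.Thread.Sleep(100);
    }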

Thanks in advance!

Thank You.

Athivarathan.S

+91-9500351051
