Channel: NAudio
Viewing all 5831 articles

New Post: Unable to load wave file from embedded resource...

Try WaveFileReader.ToSampleProvider()

Created Unassigned: Audible Glitch when Clicking on Certain GUI Elements [16477]

There is an audible glitch when clicking on certain GUI elements such as the window title bar. The audio will momentarily pause. This issue can be easily reproduced with the Signal Generator demo.

System:
Surface Pro 3
Windows 8.1
Visual Studio 2013

New Post: NAudio - audio visualizer C#

Hi there, I know this is an old thread, but I didn't think it was necessary to create a new one. What is meant by FFT?

New Post: NAudio - audio visualizer C#

An FFT (Fast Fourier Transform) takes a block of raw audio samples whose length is a power of 2 and transforms it into a frequency-and-amplitude representation. Just imagine: any signal can be represented as an overlap of many, many sine waves:

Animation

This is exactly what the FFT does: it splits the block into many sine waves and assumes them to be periodic. The sine calculation is done with vectors, which are represented as complex numbers. The power-of-2 length is just an algorithmic optimization, which is what makes it a "fast" Fourier transform.

So in practice your approach would be something like this:

Take 1024 raw audio samples (from a MeteringSampleProvider) -> FFT (in a SampleAggregator) -> complex array with 2048 entries -> loop through the first 1024 of them (or the second 1024, as the values are mirrored) like this:
    'NOTE: this Event is fired by the SampleAggregator class.
    Public Sub FftCalculated(ByVal sender As Object, ByVal e As FftEventArgs)
           For i = 0 To e.Result.Length / 2 - 1
                Dim Amplitude As Double = Math.Sqrt(e.Result(i).X ^ 2 + e.Result(i).Y ^ 2)
                Dim Frequency As Double = i * 44100 / e.Result.Length
                'TODO: Paint Frequency and Amplitude here on a control, Bitmap...
                'Some side notes:
                'Max Frequency is half the SampleRate, in this case 22050.
                'For values closer to human perception, apply a logarithm to Amplitude and Frequency.
           Next
    End Sub
NOTE: I assumed a sample rate of 44100 in the example above.
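The loop above is plain bin arithmetic, so it can be checked outside NAudio. A minimal pure-Python sketch, using a naive DFT in place of NAudio's FFT (the 64-sample block and the bin numbers are illustrative):

```python
import cmath
import math

def dft(samples):
    """Naive O(N^2) DFT; an FFT computes the same result, just faster."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

sample_rate = 44100
n = 64
target_bin = 4  # sine at 4 * 44100 / 64 = 2756.25 Hz

samples = [math.sin(2 * math.pi * target_bin * t / n) for t in range(n)]
result = dft(samples)

# Same math as the VB event handler: magnitude and bin frequency,
# looping only over the first half (the second half is a mirror image).
for i in range(n // 2):
    amplitude = math.sqrt(result[i].real ** 2 + result[i].imag ** 2)
    frequency = i * sample_rate / n

peak_bin = max(range(n // 2), key=lambda i: abs(result[i]))
print(peak_bin)  # the energy lands in bin 4
```

A real sine at an exact bin frequency concentrates its energy in that bin (with magnitude N/2), which is why the peak search finds bin 4 here.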

The above event is fired by the SampleAggregator class, which you can find in an NAudio example. To feed the SampleAggregator I used a MeteringSampleProvider with its StreamVolume event (where you can also feed a WaveformPainter, etc.):
    'NOTE: This Event is fired by MeteringSampleProvider class.
    Private Sub m_SampleProvider_StreamVolume(ByVal sender As Object, ByVal e As NAudio.Wave.SampleProviders.StreamVolumeEventArgs)

        Dim left As Single = 0
        Dim right As Single = 0

        If e.MaxSampleValues.Length > 0 Then left = e.MaxSampleValues(0)
        'Not a mono stream?
        If e.MaxSampleValues.Length > 1 Then right = e.MaxSampleValues(1)

        SampleAggregator.Add(0.5 * (left + right))
        'WaveformPainter1.AddMax((left + right) / 2)

    End Sub
Just add the MeteringSampleProvider to your signal chain to access the raw audio sample data.

I hope this points you in the right direction without confusing you too much with details.

Kind regards

Freefall


PS: I created a StereoWaveformPainter some time ago, based on the WaveformPainter class. Perhaps you want to incorporate it in a future release, Mark.

New Post: NAudio - audio visualizer C#

Ah thanks, I understand it better now!

New Post: NAudio - audio visualizer C#

Please mark this thread as answered so others with similar questions can find it quickly.

Thanks.

New Post: NAudio - audio visualizer C#


I'm not the OP of the thread, so I don't think I can. I was just asking a question in relation to a previous reply to the OP's initial question.

New Post: Naudio Buffered Wave Provider plays in cut off chunks

Hello, I am having trouble playing back the audio that I put into the BufferedWaveProvider. It plays back some of the audio, but the ends are getting cut off. I have checked the data with Wireshark and I am receiving all of it; it's just not playing back correctly.

Here is the code that runs when the user hits the play button:
private NAudio.Wave.DirectSoundOut waveOut = null;
        //NAudio.Wave.WaveOutEvent waveEvent = null;
        //NAudio.Wave.WaveCallbackInfo callBack = NAudio.Wave.WaveCallbackInfo.FunctionCallback();
        NAudio.Wave.BufferedWaveProvider wavePlayerBuffer;
        //Play Button
        private void button2_Click(object sender, EventArgs e)
        {

            // construct playback buffer
            
            //wavePlayerBuffer = new NAudio.Wave.BufferedWaveProvider(playbackFormat);

            // associate playback buffer with player 
           // waveEvent = new NAudio.Wave.WaveOutEvent();
            waveOut = new NAudio.Wave.DirectSoundOut();
            //waveOut.Init(wavePlayerBuffer); // providing connection to player 
            
            //starts streaming thread once

            CommandMessage cm = new CommandMessage();
            cm.offset = 0;
            cm.songId = 1;

            aClient.requestAuth("PLAY " + cm.offset.ToString() + " LOC " + cm.songId.ToString());

            Byte[] sampleArray = new Byte[4];
            Byte[] channelArray = new Byte[2];
            Byte[] BperS = new Byte[2];
            Byte[] BitA = new Byte[2];
            Byte[] incomingControlData = aClient.headerRiff;

            //if (BitConverter.IsLittleEndian)
            //    Array.Reverse(incomingControlData);

            Array.Copy(incomingControlData, 24, sampleArray, 0, 4);
            Array.Copy(incomingControlData, 22, channelArray, 0, 2);
            Array.Copy(incomingControlData, 34, BperS, 0, 2);
            Array.Copy(incomingControlData, 32, BitA, 0, 2);

            int sampleRate = BitConverter.ToInt32(sampleArray, 0);
            int channel = BitConverter.ToInt16(channelArray, 0);
            int BitDepth = BitConverter.ToInt16(BperS, 0);
            int BlockAllign = BitConverter.ToInt16(BitA, 0);
            NAudio.Wave.WaveFormat playbackFormat = new NAudio.Wave.WaveFormat(8000, 8, 1);//(sampleRate, BitDepth ,channel);

            wavePlayerBuffer = new NAudio.Wave.BufferedWaveProvider(playbackFormat);
            wavePlayerBuffer.BufferDuration = TimeSpan.FromSeconds(360.0);
            wavePlayerBuffer.ClearBuffer(); 

            waveOut.Init(wavePlayerBuffer);
            waveOut.Play();
            if (doOnce)
            {
                Thread thread =
                 new Thread(new ThreadStart(WaitForPackets));
                thread.IsBackground = true;
                thread.Start();
                Streaming = true;
                doOnce = false;
            }

            
            //while (waveOut.PlaybackState == NAudio.Wave.PlaybackState.Playing)
            //{
            //    Thread.Sleep(100);
            //}
        }
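The Array.Copy offsets above correspond to the canonical 44-byte PCM RIFF header (channels at byte 22, sample rate at 24, block align at 32, bits per sample at 34). A quick Python sketch that builds a synthetic header and parses those same offsets (the field values are illustrative). Note that the hardcoded new WaveFormat(8000, 8, 1) ignores the parsed values, which is one plausible cause of distorted playback:

```python
import struct

# Build a canonical 44-byte PCM WAV header (values are illustrative).
sample_rate, channels, bits = 44100, 2, 16
block_align = channels * bits // 8
byte_rate = sample_rate * block_align
header = (
    b"RIFF" + struct.pack("<I", 36) + b"WAVE"
    + b"fmt " + struct.pack("<IHHIIHH", 16, 1, channels, sample_rate,
                            byte_rate, block_align, bits)
    + b"data" + struct.pack("<I", 0)
)

# The same offsets the C# code copies out with Array.Copy:
parsed_rate = struct.unpack_from("<I", header, 24)[0]
parsed_channels = struct.unpack_from("<H", header, 22)[0]
parsed_bits = struct.unpack_from("<H", header, 34)[0]
parsed_block_align = struct.unpack_from("<H", header, 32)[0]
```

RIFF fields are little-endian, so no byte reversal is needed on a little-endian machine; the commented-out Array.Reverse would actually corrupt the values.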
And here is the code for the thread that receives data from my server:
 void WaitForPackets()
        {
            while (Streaming)
            {

                try
                {
                    byte[] incomingData = client.Receive(ref receivePoint);

                for (int i = 0; i < 1000; i++)
                {
                    streamBuffer[buffId, i + offset] = incomingData[i];
                }

                offset += 1000;
               
                if (offset == 8000)
                {
                    //creates a one dimensional array to send to be played
                    byte[] chunkReady = new byte[8000];

                    for (int i = 0; i < 8000; i++)
                    {
                        chunkReady[i] = ( (streamBuffer[buffId, i]));
                    }

                    // associated with waveOut object 
                    //MessageBox.Show(wavePlayerBuffer.BufferLength.ToString());
                    Console.WriteLine("Current duration of song" +  wavePlayerBuffer.BufferedDuration.ToString());
                    Console.WriteLine("Buffered Bytes" + wavePlayerBuffer.BufferedBytes.ToString());
                    //Console.WriteLine("Current Read " + wavePlayerBuffer.Read(chunkReady, 0, 8000));
                    Console.WriteLine("Current buffId " + buffId);
                    //
                     //
                     wavePlayerBuffer.AddSamples(chunkReady, 0, 8000);
                     //wavePlayerBuffer.Read(chunkReady,0,8000);
                     wavePlayerBuffer.DiscardOnBufferOverflow = true;
                    // wavePlayerBuffer.BufferDuration = TimeSpan.FromSeconds(buffId + 5);

                     if (startPlay && buffId == 2)
                     {
                         startPlay = false;
                         //Thread.Sleep(200);
                     }
                     
                     // increments the offsets
                     offset = 0;
                     buffId++;
                    
                }
                if (buffId > 50) buffId = 0; 
            }
                catch(Exception ex)
                {
                    MessageBox.Show(ex.ToString());
                }
            }
            
        }

New Post: Naudio Buffered Wave Provider plays in cut off chunks

This is a more up-to-date version of the download-and-play thread.
 void WaitForPackets()
        {
            while (Streaming)
            {
                //Thread.Sleep(100);
                try
                {
                    byte[] incomingData = client.Receive(ref receivePoint);

                for (int i = 0; i < 1000; i++)
                {
                    streamBuffer[buffId, i + offset] = incomingData[i];
                }

                offset += 1000;
               
                if (offset == 8000)
                {
                    //creates a one dimensional array to send to be played
                    byte[] chunkReady = new byte[8000];

                    for (int i = 0; i < 8000; i++)
                    {
                        chunkReady[i] = ( (streamBuffer[buffId, i]));
                    }

                    // associated with waveOut object 
                    //MessageBox.Show(wavePlayerBuffer.BufferLength.ToString());
                    Console.WriteLine("Current duration of song" +  wavePlayerBuffer.BufferedDuration.ToString());
                    Console.WriteLine("Buffered Bytes" + wavePlayerBuffer.BufferedBytes.ToString());
                   // Console.WriteLine("Current Read " + wavePlayerBuffer.Read(chunkReady, 0, 8000));
                    Console.WriteLine("Current buffId " + buffId);
                    //
                     //
                     wavePlayerBuffer.AddSamples(chunkReady, 0, 8000);
                     //wavePlayerBuffer.Read(chunkReady,0,8000);
                     //wavePlayerBuffer.DiscardOnBufferOverflow = true;
                    // wavePlayerBuffer.BufferDuration = TimeSpan.FromSeconds(buffId + 5);

                     //if (waveOut.GetPosition() < wavePlayerBuffer.BufferedBytes)
                     //    waveOut.Pause();
                     //else if (waveOut.PlaybackState == NAudio.Wave.PlaybackState.Paused)
                     //    waveOut.Play();
                       

                     //if (startPlay && buffId == 2)
                     //{

                     //    startPlay = false;
                     //    //Thread.Sleep(200);
                     //}
                     
                     // increments the offsets
                     offset = 0;
                     buffId++;
                    
                }
                if (buffId > 9) buffId = 0; 
            }
                catch(Exception ex)
                {
                    MessageBox.Show(ex.ToString());
                }
            }
            
        }
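The accumulate-then-add pattern in the loop above (eight 1000-byte packets per 8000-byte chunk) can be sketched independently of NAudio; the sizes below mirror the C# bookkeeping, and the generator replaces the offset/buffId counters:

```python
CHUNK = 8000
PACKET = 1000

def accumulate(packets):
    """Collect fixed-size packets; yield a full chunk every CHUNK bytes,
    mirroring the offset/buffId bookkeeping in the C# receive loop."""
    pending = bytearray()
    for pkt in packets:
        pending.extend(pkt)
        while len(pending) >= CHUNK:
            yield bytes(pending[:CHUNK])
            del pending[:CHUNK]

# Eight 1000-byte packets produce exactly one 8000-byte chunk.
packets = [bytes([i]) * PACKET for i in range(8)]
chunks = list(accumulate(packets))
```

A pending bytearray like this also tolerates packets that are not exactly 1000 bytes, which the fixed `for (int i = 0; i < 1000; i++)` copy in the C# version does not.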

New Post: Jack/Headphone listener

Hi,
I'm trying to use the NAudio library to listen for the headphone state (plugged/unplugged).
So far I've only found a class (MMDeviceEnumerator) that lets me check the headphone state. Is there a way to register a listener so that an event is dispatched when the user plugs or unplugs the headphones?
Thanks in advance,
Roberta.

New Post: Fire-and-forget game audio.

I'm trying to use NAudio to play sound in a game context, using http://mark-dot-net.blogspot.co.uk/2014/02/fire-and-forget-audio-playback-with.html as my guide. I've run into a bit of a snag, though. Playing a sound seems to set a playback position on the provider object, and I'm unclear on how to reset it or how to make a copy decoupled from the cached sound. This means that trying to play two overlapping instances of the same sound doesn't start a new sound but rather doubles the existing one, and once a sound has played through, it effectively can't be played again.

Is this indicative of my having done something wrong, or is it intended behavior? My implementation passes sound around as sample providers, and I've gotten the sense from looking around that handling wave providers instead might avoid this. Is that a red herring, or a good lead?

I'd post sample code, but I'm not sure which part would be useful to excerpt, and the current state is a bit messy with attempted fixes and workarounds, so I thought it would be better to ask these preliminary questions first.
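For what it's worth, the symptom described (a provider that remembers its playback position across plays) is what happens when one reader object is shared between plays. A minimal sketch of the usual fix: cache the decoded samples once and give every play its own cursor. Class and method names here are illustrative, not NAudio's API:

```python
class CachedSound:
    """Immutable decoded audio; each play gets an independent reader."""
    def __init__(self, samples):
        self.samples = list(samples)

    def new_reader(self):
        return CachedSoundReader(self)

class CachedSoundReader:
    def __init__(self, sound):
        self.sound = sound
        self.position = 0  # per-play position, never shared

    def read(self, count):
        chunk = self.sound.samples[self.position:self.position + count]
        self.position += len(chunk)
        return chunk  # an empty list signals the end of the sound

sound = CachedSound([0.1, 0.2, 0.3, 0.4])
a, b = sound.new_reader(), sound.new_reader()
first = a.read(2)   # advancing one reader...
second = b.read(2)  # ...does not move the other
```

Mark's fire-and-forget post uses the same split (a CachedSound plus a CachedSoundSampleProvider per play); the key point is that the mixer should receive a fresh reader each time, never the cached object itself.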

New Post: Getting unspecified error while executing simple command line app

I have a similar error occurring, with details as noted:

I have an application using Naudio 1.7.3 (or 1.7.2 with the same result) that runs fine on a laptop running Windows 7, but does not run on Windows 8.1 desktop machine using the Realtek High Definition Audio driver, quitting with the error "NAudio.MmException UnspecifiedError calling waveOutOpen"

Both the laptop and the desktop use Realtek High Definition Audio. The laptop also has ATI HDMI Audio; the desktop also has NVIDIA High Definition Audio and Virtual Audio Cable. But both machines use the Realtek drivers with the application, and the other audio devices are not in use. I still get the same error if I disable the NVIDIA devices and the Virtual Audio Cable devices.

If I plug a "3 - Media USB Audio Device" into the Windows 8.1 desktop, make it my default device instead of the Realtek device, and use it with the app, then the code runs fine on the Windows 8.1 machine with no error and perfect audio.

My question is, why does the Realtek device not work with my code, and what can be done about it?

I have the latest drivers for all devices.

The problematic code is:
public static BufferedWaveProvider receiveProvider2;   // buffers receive audio to prevent clicks/dropouts
public static MixingWaveProvider32 mixer;              // mixes receive audio and sidetone audio into a common waveOut stream
public static Wave16ToFloatProvider receiveIEEE;       // converts PCM16 receive audio to the IEEE floating point needed by the mixer
public static SineWaveProvider32 sineWaveProvider;     // CW sidetone generator

sineWaveProvider = new SineWaveProvider32();
sineWaveProvider.SetWaveFormat(48000, 1); // 48kHz mono
sineWaveProvider.Frequency = 1000;
sineWaveProvider.Amplitude = 0.0f;
sidetonevol = (float)VolumeTrackBar.Value / (float)VolumeTrackBar.Maximum;

mixer = new MixingWaveProvider32();
receiveProvider2 = new BufferedWaveProvider(new WaveFormat(48000, 16, 1));
receiveProvider2.DiscardOnBufferOverflow = true;
receiveProvider2.BufferDuration = TimeSpan.FromSeconds(0.25);
receiveIEEE = new Wave16ToFloatProvider(receiveProvider2);

mixer.AddInputStream(receiveIEEE);
mixer.AddInputStream(sineWaveProvider);
waveOut1 = new WaveOut() { DesiredLatency = 100 };
waveOut1.Init(mixer);
waveOut1.Play();
The error, as noted occurs at the waveOut1.Play() statement.

An alternate, simpler form of the code does run on BOTH machines without error:
public static BufferedWaveProvider receiveProvider2;

receiveProvider2 = new BufferedWaveProvider(new WaveFormat(48000, 16, 1));
receiveProvider2.DiscardOnBufferOverflow = true;
receiveProvider2.BufferDuration = TimeSpan.FromSeconds(20);
waveOut1 = new WaveOut() { DesiredLatency = 100 };
waveOut1.Init(receiveProvider2);
waveOut1.Play();
Thanks in advance for your thoughts,

Roger
W3SZ

The detailed error message is:

NAudio.MmException was unhandled
_HResult=-2146233088
_message=UnspecifiedError calling waveOutOpen
HResult=-2146233088
IsTransient=false
Message=UnspecifiedError calling waveOutOpen
Source=NAudio
StackTrace:
   at NAudio.MmException.Try(MmResult result, String function)
   at NAudio.Wave.WaveOut.Init(IWaveProvider waveProvider)
   at KISS_Konsole.Form1.OnOffButton_Click(Object sender, EventArgs e) in f:\HPSDR_KISS_CONSOLE_Server_NEWEST\Latest - Copy\Unified\Form1.cs:line 4696
   at System.Windows.Forms.Control.OnClick(EventArgs e)
   at System.Windows.Forms.Button.OnClick(EventArgs e)
   at System.Windows.Forms.Button.OnMouseUp(MouseEventArgs mevent)
   at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks)
   at System.Windows.Forms.Control.WndProc(Message& m)
   at System.Windows.Forms.ButtonBase.WndProc(Message& m)
   at System.Windows.Forms.Button.WndProc(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
   at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
   at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
   at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(IntPtr dwComponentID, Int32 reason, Int32 pvLoopData)
   at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
   at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
   at System.Windows.Forms.Application.Run(Form mainForm)
   at KISS_Konsole.Program.Main(String[] args) in f:\HPSDR_KISS_CONSOLE_Server_NEWEST\Latest - Copy\Unified\Program.cs:line 33
   at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)
   at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
   at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
   at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.ThreadHelper.ThreadStart()
InnerException:

New Post: How to play a wave-file with a custom format?

Excuse me, could you help me please?
I need to play a wave file with a custom format:
AverageBytesPerSecond   60638   int
BitsPerSample   4   int
BlockAlign  44  int
Channels    2   int
Encoding    Adpcm   NAudio.Wave.WaveFormatEncoding
ExtraSize   32  int
SampleRate  44101   int
How can I do that? When I tried, I got one of these errors:
AcmNotPossible calling acmStreamOpen
or
"Unsupported source encoding"

Code:
WaveFormat format = ReadWaveFormat();

byte[] data = ReadData();
MemoryStream ms = new MemoryStream(data, 0, data.Length, false);
WaveStream waveStream = new RawSourceWaveStream(ms, format);

SampleChannel sampleChannel = new SampleChannel(waveStream, true); // error here
MeteringSampleProvider postVolumeMeter = new MeteringSampleProvider(sampleChannel);

IWavePlayer wavePlayer = GetPlayer();
wavePlayer.Init(postVolumeMeter);
wavePlayer.Play();

New Post: Windows 8 store app - concatenate wav files

Hello, I'm an absolutely new developer in C#.

I saw this answer by Mark Heath:

http://stackoverflow.com/questions/6777340/how-to-join-2-or-more-wav-files-together-programatically

I tried to use it in a Windows 8 store app, but I found that several classes, like WaveFileReader, are unavailable.

I have two wav files, each 0.5 s in duration. I want to play a merged wav built from those files according to a binary string ("10101011101", for example): 1 = file_1.wav, 0 = file_2.wav.

Is there any way to make this possible using NAudio?

Thanks
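Setting aside which NAudio classes are available in Windows 8 store apps, the bit-string-to-sequence logic itself is straightforward. A sketch that concatenates raw sample blocks per bit (real code would strip the WAV headers and verify that both files share a format first):

```python
def sequence_for(bits, block_one, block_zero):
    """Map '1' -> block_one, '0' -> block_zero and concatenate the blocks."""
    out = bytearray()
    for bit in bits:
        out.extend(block_one if bit == "1" else block_zero)
    return bytes(out)

# Tiny stand-ins for the two decoded 0.5 s clips.
one = b"\x01" * 4
zero = b"\x00" * 4
merged = sequence_for("101", one, zero)
```

The merged bytes would then be written after a single WAV header (or fed to a playback buffer); concatenating whole files including their headers is what usually breaks naive approaches.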

New Post: Write mp3 file to serial port to play on other side on phone call

Hi, I am trying to write data from an mp3 file to a serial port so that I can play it on the other side of a phone call, but I can only hear noise. Here is my code:

bufferedWaveProviderout = new BufferedWaveProvider(new WaveFormat(8000, 16, 1));
        // try
        {
            //    dynamic audioBufferSize = 320;
            //  int offset = 0;
            //  byte[] buffer2 = new byte[320];

            Mp3FileReader sf = new Mp3FileReader("d:\\a.mp3");
            byte[] fileByte = new byte[sf.Length + 1];
            sf.Read(fileByte, 0, Convert.ToInt32(sf.Length));

            int max = fileByte.Length;
            int playingyt = 0;
            while (playingyt <= max)
            {

                byte[] buf = fileByte.Skip(playingyt).Take(1600).ToArray();

                byte[] source = buf;
                Stream byteStream12 = new MemoryStream(source);
                WaveFormat waveFormat = new WaveFormat(8000, 16, 1);

                RawSourceWaveStream rawSourceWaveStream = new RawSourceWaveStream(byteStream12, waveFormat);
                byte[] bt = ConvertNonSeekableStreamToByteArray(byteStream12);
                bufferedWaveProviderout.AddSamples(bt, 0, bt.Length);

                var resampleStream = new AcmStream(new WaveFormat(8000, 16, 1), WaveFormat.CreateCustomFormat(WaveFormatEncoding.MuLaw, 8000, 1, 8000 * 1, 1, 8));

                byte[] source1 = bt;
                Buffer.BlockCopy(source1, 0, resampleStream.SourceBuffer, 0, source1.Length);
                int sourceBytesConverted = 0;
                var convertedBytes = resampleStream.Convert(source1.Length, out sourceBytesConverted);
                if (sourceBytesConverted != source1.Length)
                {
                    Console.WriteLine("We didn't convert everything: {0} bytes in, {1} bytes converted", source1.Length, sourceBytesConverted);
                }

                var converted = new byte[convertedBytes];
                Buffer.BlockCopy(resampleStream.DestBuffer, 0, converted, 0, convertedBytes);

                MemoryStream byteStream = new MemoryStream(converted);


                Stream byteStream122 = new MemoryStream(converted);

                dynamic audioBufferSize = 320;
                int offset = 0;
                //  byte[] buffer = new byte[audioBufferSize - 1];// (audioBufferSize - 1) {}
                byte[] buffer2 = new byte[320];

                while (byteStream122.Read(buffer2, offset, audioBufferSize - offset) > 0)
                {

                    {

                        {
                            _spManager.WriteVoice(buffer2, offset, buffer2.Length - offset);

                            PauseForMilliSeconds(20);
                        }

                    }
                }

                playingyt = playingyt + 1600;
            }





        }
Thanks in advance
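One plausible cause of the noise is that Mp3FileReader.Read already returns decoded PCM at the MP3's own format (typically 44100 Hz stereo), so treating those bytes as 8 kHz/16-bit mono and slicing them with Skip/Take misinterprets the samples. The mu-law step itself is standard G.711; here is a pure-Python encode/decode sketch for checking that part of the chain (BIAS and CLIP are the standard G.711 constants):

```python
BIAS = 0x84   # standard G.711 mu-law bias
CLIP = 32635

def ulaw_encode(sample):
    """Encode one 16-bit PCM sample to an 8-bit mu-law byte (G.711)."""
    sign = 0x80 if sample < 0 else 0
    if sample < 0:
        sample = -sample
    sample = min(sample, CLIP) + BIAS
    exponent, mask = 7, 0x4000
    while exponent > 0 and not (sample & mask):
        mask >>= 1
        exponent -= 1
    mantissa = (sample >> (exponent + 3)) & 0x0F
    return ~(sign | (exponent << 4) | mantissa) & 0xFF

def ulaw_decode(byte):
    """Decode an 8-bit mu-law byte back to a 16-bit PCM sample."""
    byte = ~byte & 0xFF
    sign = byte & 0x80
    exponent = (byte >> 4) & 0x07
    mantissa = byte & 0x0F
    magnitude = (((mantissa << 3) + BIAS) << exponent) - BIAS
    return -magnitude if sign else magnitude

# Round trip: quantization error stays within one mu-law step.
for s in (-20000, -1000, 0, 37, 1000, 20000):
    assert abs(ulaw_decode(ulaw_encode(s)) - s) <= 1024
```

If this math checks out but the call still sounds like noise, the problem is upstream: the bytes fed to the AcmStream are not actually 8 kHz/16-bit mono PCM, so resampling the decoded MP3 to that format first would be the next thing to verify.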

New Post: USB headsets: how to associate WaveIn/WaveOut device numbers with USB DevicePath?

I had a similar problem with a project at work: I needed to associate a microphone and an earphone into a headset object. You can see my solution on StackOverflow.

Hope this will help you.

New Post: DummyOut

Hi,

I need a dummy IWavePlayer that can be swapped in for WaveOut. It should act as a stand-in for WaveOut: it should not send data to an output device, but it should raise the PlaybackStopped event when playback stops (after the duration of the sound file).

Does this already exist, or does anyone have an idea where to start?

New Post: DummyOut

Sorry, it doesn't exist, but I'd kick off a thread when playback starts that reads 100ms of audio data every 100ms from the source provider passed to Init. It can then raise PlaybackStopped when Read returns 0, or when the user calls Stop.
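That suggestion can be sketched roughly as follows; the names are illustrative, not NAudio's API, and a real dummy would also sleep for each block's duration to pace consumption at real time:

```python
import threading

class DummyOut:
    """Consumes audio block by block without touching a device;
    fires on_stopped when the source runs dry or stop() is called."""
    def __init__(self, read_source, block_size, on_stopped):
        self.read_source = read_source   # callable(n) -> bytes, b"" at end
        self.block_size = block_size
        self.on_stopped = on_stopped
        self._stop = threading.Event()

    def play(self):
        self._thread = threading.Thread(target=self._run)
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

    def _run(self):
        while not self._stop.is_set():
            chunk = self.read_source(self.block_size)
            if not chunk:  # Read returned 0: playback finished
                break
            # A real dummy would time.sleep() for the block's duration here.
        self.on_stopped()

stopped = []
data = bytearray(b"x" * 10)

def read(n):
    out, data[:] = bytes(data[:n]), data[n:]
    return out

player = DummyOut(read, 4, lambda: stopped.append(True))
player.play()
player._thread.join()
```

The two exit paths mirror Mark's description: the loop ends either because Read returned nothing (natural end of the file) or because the user requested a stop, and both paths raise the stopped callback exactly once.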

New Post: DummyOut

Yeah, I went that way; it works like a charm.

New Post: How to play a wave-file with a custom format?

Hi, you could try this instead of SampleChannel:
var sampleProvider = rawSourceWaveStream.ToSampleProvider();
That said, this looks like an ACM error, which indicates that the Audio Compression Manager can't handle your self-created wave format and fails to convert it.

Greetz