
New Post: Weird slowing down of audio when I use Mp3StreamingDemo

Thanks for your reply. Can you give me a little example of how I can do this, please?

New Post: Detect full download in Mp3StreamingDemo

Hi!
How can I detect that the MP3 download from the HttpWebResponse is finished in NAudioDemo -> Mp3StreamingDemo, or how many bytes have been downloaded so far? I've tried comparing readfullystream.Position with httpwebresponse.ContentLength, but for some reason ContentLength is always bigger than Position, even though ContentLength is the length of the MP3 file's byte array. Manipulations with bufferedWaveProvider don't work either.

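Here is roughly what I'm trying, as a minimal sketch (not the demo code itself): counting the bytes as they come off the response stream and comparing the running total with ContentLength. The totalBytesRead counter is my own addition, and webResponse stands for the HttpWebResponse from the demo.

    // Count every byte read from the response stream (sketch only; the
    // variable names here are mine, not the demo's)
    long totalBytesRead = 0;
    var buffer = new byte[16384];
    using (var responseStream = webResponse.GetResponseStream())
    {
        int bytesRead;
        while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            totalBytesRead += bytesRead;
            // ...pass 'buffer' on to the decoding side here...
        }
    }
    // If the server reported a ContentLength, the download is complete once
    // at least that many bytes have been read
    bool downloadComplete = totalBytesRead >= webResponse.ContentLength;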

New Post: Calling WmaWriter.Write in data available event causes recording from microphone to stop

I have a signal chain where I mix the audio from a headset microphone with the headset speakers and write the audio data to a .wma or a .wav file block by block. This works if I write to a .wav file. If I use WmaWriter and call WmaWriter.Write in the handler for the microphone's DataAvailable event, passing the block of data received, my code immediately receives a RecordingStopped event from the microphone.

Here is my code:
        /// <summary>
        /// Records Mic and Audio to separate files, combines to a single WAV
        /// and calls the WMA conversion method
        /// </summary>
        public void StartRecording()
        {
            if (!_disposed)
            {
                if (!Directory.Exists(SaveLocation))
                {
                    Directory.CreateDirectory(SaveLocation);
                }
                CurrentDateTime = DateTime.UtcNow.ToString("yyyy_MM_dd_HH_mm_ss_fff");
                SpeakerAudioRecordingStopped = false;
                MicAudioRecordingStopped = false;

                _microphone = new WasapiCapture();
                _microphone.ShareMode = AudioClientShareMode.Shared;
                _microphone.WaveFormat = new WaveFormat(44100, 16, 2);

                _speakers = new WasapiLoopbackCapture();
                _speakers.ShareMode = AudioClientShareMode.Shared;

                _micAudioBuffer = new BufferedWaveProvider(_microphone.WaveFormat);
                _micAudioBuffer.DiscardOnBufferOverflow = true;
                _speakersAudioBuffer = new BufferedWaveProvider(_speakers.WaveFormat);
                _speakersAudioBuffer.DiscardOnBufferOverflow = true;

                _microphone.DataAvailable += _microphone_DataAvailable;
                _microphone.RecordingStopped += _microphone_RecordingStopped;

                _speakers.DataAvailable += _speakersAudio_DataAvailable;
                _speakers.RecordingStopped += _speakersAudio_RecordingStopped;

                _mixer = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 1));
                _mixer.ReadFully = true;

                _micAudioReSampler = new MediaFoundationResampler(_micAudioBuffer, new WaveFormat(44100, 1)) { ResamplerQuality = 60 };
                _speakersAudioReSampler = new MediaFoundationResampler(_speakersAudioBuffer, new WaveFormat(44100, 1)) { ResamplerQuality = 60 };
              
                _mixer.AddMixerInput(_micAudioReSampler);
                _mixer.AddMixerInput(_speakersAudioReSampler);

                MixedWavFile = SaveLocation + User + "_" + CurrentDateTime + "_" + "mixed.wav";
                MixedWmaFile = SaveLocation + User + "_" + CurrentDateTime + "_" + "mixed.wma";
                _mixerOutputConverter = new SampleToWaveProvider16(_mixer);
                
                var cx =  Codec.GetCodecs(MediaTypes.WMMEDIATYPE_Audio);
                if (cx == null)
                {
                    throw new IOException("No Codec Available to convert to WMA file");
                }

                var co = cx.FirstOrDefault(c => c.Name == "Windows Media Audio Voice 9");
                if (co != null)
                {
                    var cf = co.CodecFormats.FirstOrDefault(c => c.Description == "16 kbps, 16 kHz, mono");
                    if (cf == null)
                    {
                        throw new IOException("Could not find 16kbs, 16kHz, mono codec format");
                    }

                    _wmaWriter = new WmaWriter(new FileStream(MixedWmaFile, FileMode.Create), _mixerOutputConverter.WaveFormat, cf);
                    
                }
                else
                {
                    throw new IOException("Could not find Windows Media Audio Voice 9");
                }

                // Data rates (bits per second) of the mic input and the mixer output, used to work out the read ratio
                micBitsPerSecond = _micAudioBuffer.WaveFormat.Channels * _micAudioBuffer.WaveFormat.BitsPerSample * _micAudioBuffer.WaveFormat.SampleRate;
                mixerOutputBitsPerSecond = _mixerOutputConverter.WaveFormat.Channels * _mixerOutputConverter.WaveFormat.BitsPerSample * _mixerOutputConverter.WaveFormat.SampleRate;

                micToMixerRatio = micBitsPerSecond / mixerOutputBitsPerSecond;
                mixedBuffer = new byte[_micAudioBuffer.BufferLength];

                _microphone.StartRecording();
                _speakers.StartRecording();
            }
            else
            {
                throw new ObjectDisposedException("Object has been disposed. Cannot start recording.");
            }
        }

        void StopMixing()
        {
            if (SpeakerAudioRecordingStopped && MicAudioRecordingStopped)
            {
                _mixer.RemoveAllMixerInputs();
                CleanupRecording();
            }
        }

        void _speakersAudio_RecordingStopped(object sender, StoppedEventArgs e)
        {
            SpeakerAudioRecordingStopped = true;
            StopMixing();
        }

        void _microphone_RecordingStopped(object sender, StoppedEventArgs e)
        {
            MicAudioRecordingStopped = true;
            StopMixing();
        }

        void _microphone_DataAvailable(object sender, WaveInEventArgs e)
        {
            _micAudioBuffer.AddSamples(e.Buffer, 0, e.BytesRecorded);
            int mixedBytesRead = _mixerOutputConverter.Read(mixedBuffer, 0, e.BytesRecorded / micToMixerRatio);
            _wmaWriter.Write(mixedBuffer, 0, mixedBytesRead);
        }

        void _speakersAudio_DataAvailable(object sender, WaveInEventArgs e)
        {
            _speakersAudioBuffer.AddSamples(e.Buffer, 0, e.BytesRecorded);

        }

        /// <summary>
        /// This method disposes and closes all the recording objects
        /// and calls the mixer function
        /// </summary>
        public void StopRecording()
        {
            if (!_disposed)
            {
                _microphone.StopRecording();
                _speakers.StopRecording();
            }
            else
            {
                throw new ObjectDisposedException("Object has been disposed. No recording to stop.");
            }
        }

New Post: Problem with playing parts of an audio file

Hello, I'm new to NAudio. I'm working on a project where I need to play an audio file from t1 to t4 and then jump to t7 and play to t11, without lag.
I can do that if I minimize the latency, but then over time the WAV starts buzzing. Here is the code I'm using.

To determine the position I'm using this:
private long GetBitPosition(double time)
{
    var bitPosition = (long)(WaveFormat.BlockAlign * WaveFormat.SampleRate * time);
    return bitPosition;
}
And then in the Read override I'm doing the following:
public override int Read(byte[] buffer, int offset, int count)
{
    int bytesRead = sourceStream.Read(buffer, offset, count);

    var upperBound = Position + count;
    if (upperBound >= _endBitPosition)
    {
        if (EnableLooping)
        {
            if (Math.Abs(_startBitPosition - sourceStream.Position) > count)
                sourceStream.Position = _startBitPosition;

            if (tmpEnd != null)
            {
                EndSample = tmpEnd.Value;
                OnPlayNext();
            }
            else
            {
                OnPlayLoop();
            }
        }
        else
        {
            sourceStream.Position = sourceStream.Length;
            OnPlayEnded();
        }
    }
    return bytesRead;
}
This works fine if my buffer is small, but then I start getting buzzes after about 3 minutes of playing.
Any help would be greatly appreciated. I have tried WaveOut, WasapiOut, and AsioOut, but no luck.

New Post: Unexpected end of stream before frame complete

Hi!

Sorry for my bad English.

I use your Mp3Streaming demo. I now get the response stream from the server, write it to an MP3 file, and play from that file at the same time, using:

    FileStream fileStream = new FileStream("myfile.mp3", FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    var readFullyStream = new ReadFullyStream(fileStream);

But about 50% of the files from this website throw an EndOfStreamException ("Unexpected end of stream before frame complete") at different points in time.
Can you offer me a solution to this problem, please?

New Post: Calling WmaWriter.Write in data available event causes recording from microphone to stop

I examined the call stack and found an exception is being thrown with this message:

Unable to cast COM object of type 'System.__ComObject' to interface type 'NAudio.WindowsMediaFormat.IWMWriter'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{96406BD4-2B2B-11D3-B36B-00C04F6108FF}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

Do I need to install some Windows Media Audio component to create the COM interface that is missing?

New Post: WaveFileWriter example for the WinRT

Where can I find an example of how to work with WaveFileWriter for WinRT?
How do I use it? Can I save a stream from a mixer?

New Post: Wave graph flickering

Hello all,

I need a waveform graph of a WAV file for an audio player. I have used the NAudio DLL and WaveViewer to display the waveform, and to scroll the graph I update StartPosition by an increment on every call, driven by a timer with an interval.

But it doesn't look good because of flickering.

Can you give me some ideas on how to move the waveform smoothly, without any flickering, in my audio player?

New Post: ReadFullyStream analog for FileStream

ReadFullyStream from the Mp3Streaming demo doesn't work with a FileStream; it throws
"Unexpected end of stream before frame complete".

Can someone offer me sample code for a ReadFullyStream that works with a FileStream, rather than the network ConnectStream used in the Mp3Streaming demo?

New Post: Music pitch detection

Anyone know of a good way to detect the pitch of music relative to the A note?

95% of Western music (except for Led Zeppelin and a few others) uses A440 Hz, but A432 Hz sounds so much better, since 432 Hz resonates perfectly with the ears and the water molecule.
In case you are interested: http://omega432.com/432-news/testimonials

I've created a small (free) app to detect the pitch of the music playing in Windows. It works fine with piano and similar instruments and with simpler music, but it gets confused when too many different instruments are playing at once, especially bass frequencies. The pitch detection I am using is https://pitchtracker.codeplex.com/, which uses a modified autocorrelation algorithm.

FFT on its own is no good. There are many different algorithms for detecting pitch, but I'm not sure which one would be best to use, or whether there are existing pitch detection libraries written in C#; using a C/C++ DLL would be fine too, I suppose.

I came across the C-based library called Aubio, but before I try it out I would like to gather some guidance or knowledge from you guys.

New Post: The best way to split an audio chain?

First of all - many thanks for NAudio!

Is it possible in NAudio to split the audio signal so that the input audio is processed by separate filters whose (non-audible) outputs are consumed by my application, while the dry, unprocessed signal is finally sent on to the soundcard?

New Post: I'm lost, help appreciated

Hello, I'm very lost using this great library.

I have 4 buttons.
When I click a button, it plays a sound using WaveOut.

I'm playing a background sound too, using another WaveOut.

I'm trying to record only the sounds produced by clicking the buttons (like a piano).

Can someone guide or help me with this issue, please?

New Post: I'm lost, help appreciated

One approach is to use just one WaveOut whose input is a MixingSampleProvider. Then, whenever you click a button, you add a new input to that MixingSampleProvider. You could then use the SavingWaveProvider approach I discuss here to save the audio; see the sketch below.
You'd need to make sure you don't create a huge file though, and stop writing to the WAV once the sounds are complete.
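A rough sketch of that layout. SavingWaveProvider is the class from the article mentioned above, not part of the NAudio library itself, and I'm assuming a constructor that takes a source IWaveProvider and a file name; the file names and button wiring are just placeholders.

    // One output device, one mixer; button clicks just add inputs to the mixer
    var mixer = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 2));
    mixer.ReadFully = true; // keep producing silence so playback never stops

    // SavingWaveProvider writes everything that passes through it to a WAV file
    var saving = new SavingWaveProvider(new SampleToWaveProvider(mixer), "mixed.wav");

    var waveOut = new WaveOutEvent();
    waveOut.Init(saving);
    waveOut.Play();

    // in each button's Click handler (file name is a placeholder):
    // mixer.AddMixerInput(new AudioFileReader("click1.wav"));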

New Post: The best way to split an audio chain?

You would create your own IWaveProvider that can pass the audio signal on to some other code doing the DSP as it passes through. The Read method would read from the source IWaveProvider and send the data to the DSP code, as well as returning it; something like the sketch below.
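A minimal sketch of such a pass-through provider. TappingWaveProvider and IDspSink are illustrative names of my own, not NAudio types; the DSP interface is whatever your application needs.

    // Passes audio through unchanged while handing a copy to your DSP code
    public class TappingWaveProvider : IWaveProvider
    {
        private readonly IWaveProvider source;
        private readonly IDspSink dsp; // hypothetical interface for your own analysis code

        public TappingWaveProvider(IWaveProvider source, IDspSink dsp)
        {
            this.source = source;
            this.dsp = dsp;
        }

        public WaveFormat WaveFormat { get { return source.WaveFormat; } }

        public int Read(byte[] buffer, int offset, int count)
        {
            // read the dry signal from the source...
            int bytesRead = source.Read(buffer, offset, count);
            // ...give the DSP/analysis code a look at it...
            dsp.Process(buffer, offset, bytesRead);
            // ...and return it unmodified so the soundcard gets the dry signal
            return bytesRead;
        }
    }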

New Post: Music pitch detection

Autocorrelation is one of the better algorithms. It's really hard to detect pitch accurately when lots of different instruments are playing at the same time, though.
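For reference, a bare-bones autocorrelation estimate on a mono float buffer; this is a sketch of the general technique, not the PitchTracker code, and it will happily make octave errors on anything polyphonic.

    // Very simple autocorrelation pitch estimate (sketch only).
    // Searches lags corresponding to roughly 60 Hz - 1000 Hz and returns the
    // frequency of the lag with the strongest correlation, or 0 if none found.
    public static float EstimatePitch(float[] samples, int sampleRate)
    {
        int minLag = sampleRate / 1000;
        int maxLag = sampleRate / 60;
        int bestLag = 0;
        float bestCorrelation = 0f;

        for (int lag = minLag; lag <= maxLag; lag++)
        {
            float correlation = 0f;
            for (int i = 0; i + lag < samples.Length; i++)
            {
                correlation += samples[i] * samples[i + lag];
            }
            if (correlation > bestCorrelation)
            {
                bestCorrelation = correlation;
                bestLag = lag;
            }
        }

        return bestLag == 0 ? 0f : (float)sampleRate / bestLag;
    }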

New Post: ReadFullyStream analog for FileStream

Well, that just means it got to the end of the file while still expecting more data. That can happen sometimes with MP3 files. You could just catch this exception and ignore it.
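For example, something like this around the frame read, assuming the same readFullyStream as in the demo:

    // Treat a truncated final frame as end-of-stream rather than an error
    Mp3Frame frame;
    try
    {
        frame = Mp3Frame.LoadFromStream(readFullyStream);
    }
    catch (EndOfStreamException)
    {
        frame = null; // the file ended mid-frame; stop decoding instead of crashing
    }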

New Post: Wave graph flickering

What are you using to draw the waveform? If it's WinForms/GDI, then make sure you are using double buffering.
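For example, in the constructor of the control doing the drawing (a standard WinForms setting, nothing NAudio-specific):

    // Enable double buffering so each frame is drawn off-screen first
    SetStyle(ControlStyles.AllPaintingInWmPaint |
             ControlStyles.OptimizedDoubleBuffer |
             ControlStyles.UserPaint, true);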

New Post: WaveFileWriter example for the WinRT

Someone has kindly contributed a WaveFileWriterRT class, but unfortunately there are no code samples written for it yet. It ought to be similar to using the regular WaveFileWriter class, though; see the desktop sketch below.
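A sketch of the usual desktop WaveFileWriter pattern, which WaveFileWriterRT should broadly mirror; waveIn here is assumed to be some IWaveIn capture source, and the file name is a placeholder.

    // Write captured audio to a WAV file as it arrives
    var writer = new WaveFileWriter("recorded.wav", waveIn.WaveFormat);

    waveIn.DataAvailable += (s, e) =>
    {
        writer.Write(e.Buffer, 0, e.BytesRecorded);
    };

    waveIn.RecordingStopped += (s, e) =>
    {
        writer.Dispose(); // finalises the WAV header
    };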

New Post: Calling WmaWriter.Write in data available event causes recording from microphone to stop

The WMA codecs are installed by default on most consumer versions of Windows. Are you on a server version? If so, there are various Windows extensions called something like "Desktop Media Experience" that you can install.

New Post: Problem with playing parts of an audio file

Make sure your repositions are block-aligned (using the BlockAlign property of the WaveFormat); see the sketch below.
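A minimal sketch of the alignment, assuming a byte position computed as in the GetBitPosition method above:

    // Round a byte position down to the nearest block boundary so a reposition
    // never lands in the middle of a sample frame
    private long AlignToBlock(long bytePosition)
    {
        return bytePosition - (bytePosition % WaveFormat.BlockAlign);
    }

    // e.g. sourceStream.Position = AlignToBlock(GetBitPosition(startTime));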