Channel: NAudio

New Post: Play same file multiple times using WaveOutEvent

Sounds like a good suggestion. I will try to get something like this into the next NAudio release.

New Post: Any recommendations on integrating SoundTouch?

I have seen the PracticeSharp code, but I'm curious whether anyone has done the work of turning it into a provider?

Otherwise, I will be playing around with making it into a provider on my own.

Paul

New Post: Any recommendations on integrating SoundTouch?

Not that I know of. Let us know how you get on.

New Post: Code example of using FFT on a live audio stream

You can take a look at the NAudioDemo project. It contains a live audio stream demo; take a look at how the audio gets decoded. You will find that there is a BufferedWaveProvider which stores the decoded raw audio and serves it to the playback callback (hmmm, funny word combination :D).

You need to calculate the FFT somewhere between the BufferedWaveProvider and the SoundOut itself, because once the audio data is in the BufferedWaveProvider it makes no difference whether you are playing back a file or a live audio stream. So first implement the buffering and decoding of the audio stream itself. After that you can take a closer look at the WPF demo application, where you will find a sample visualisation using an FFT.
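
Roughly, that "somewhere between" could look like the sketch below (a minimal, untested example of mine, not code from the demo; the class name SampleTapProvider and the FFT length of 1024 are my own choices). It wraps whatever ISampleProvider you feed to the output device and runs NAudio.Dsp.FastFourierTransform whenever it has collected a full block of samples:

    using System;
    using NAudio.Dsp;
    using NAudio.Wave;

    // Sits between the buffered audio and the output device and computes an FFT
    // over each block of 1024 samples as they pass through Read().
    class SampleTapProvider : ISampleProvider
    {
        private readonly ISampleProvider source;
        private readonly Complex[] fftBuffer = new Complex[1024]; // must be a power of two
        private int fftPos;

        public SampleTapProvider(ISampleProvider source) { this.source = source; }

        public WaveFormat WaveFormat { get { return source.WaveFormat; } }

        public int Read(float[] buffer, int offset, int count)
        {
            int read = source.Read(buffer, offset, count);
            for (int n = 0; n < read; n++)
            {
                // apply a window and collect samples until the FFT buffer is full
                fftBuffer[fftPos].X = buffer[offset + n] *
                    (float)FastFourierTransform.HammingWindow(fftPos, fftBuffer.Length);
                fftBuffer[fftPos].Y = 0f;
                if (++fftPos == fftBuffer.Length)
                {
                    fftPos = 0;
                    FastFourierTransform.FFT(true, 10, fftBuffer); // 2^10 = 1024
                    // fftBuffer now holds the spectrum; hand it to your visualisation here
                }
            }
            return read;
        }
    }

Convert the BufferedWaveProvider to samples first (for example with ToSampleProvider() or Pcm16BitToSampleProvider, depending on your NAudio version), wrap it in this class, and hand the result back to the output device, e.g. via a SampleToWaveProvider.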

New Post: streaming mp3 and fading

Hi! Firstly I want to say a big thanks for NAudio, it's a very useful tool.
I am trying to receive several Icecast MP3 streams and show them on a progress bar. It works fine, but I have a little problem: after some time my progress bar fades and looks frozen. My code is based on the Mp3StreamingDemo and I need some help understanding the reasons for this situation.

Could you help me understand this part of the code? It writes "Buffer getting full, taking a break" to my listbox many times.
    // pause the download/decode loop while the buffer is close to full
    if (bufferedWaveProvider != null &&
        bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes
            < bufferedWaveProvider.WaveFormat.AverageBytesPerSecond / 4)
    {
        formobj.addtolistbox("Buffer getting full, taking a break ");
        Thread.Sleep(500);
    }
Maybe I should insert bufferedWaveProvider.ClearBuffer(); there?

New Post: streaming mp3 and fading

If you call Thread.Sleep on your GUI thread, your progress bar will freeze, because no new window messages will be processed for the next 500 ms.
Don't call this on your GUI thread. But I don't know exactly what your problem is. The BufferedWaveProvider buffers audio data from your MP3 stream into a buffer (see the name: BufferedWaveProvider -> buffers wave data). If this buffer is getting full, you stop buffering for a while so the buffer can drain. Once it is filled to less than 75% of its space, you start buffering again.
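
In other words, the download/decode loop (including the Thread.Sleep(500)) belongs on a worker thread, roughly like this (a sketch only; StreamMp3 and streamUrl stand for your own streaming method and URL, and formobj/addtolistbox are the names from your code, assumed here to be a WinForms control and method):

    // run the download/decode loop off the GUI thread so Sleep(500) cannot freeze the UI
    ThreadPool.QueueUserWorkItem(state => StreamMp3(streamUrl));

    // any UI update made from that worker thread must be marshalled back to the GUI thread
    formobj.BeginInvoke(new Action(() =>
        formobj.addtolistbox("Buffer getting full, taking a break")));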

But you might have to explain your problem to us in a bit more detail.

New Post: Sample Aggregation For WASAPI Loopback

Just had a read of that top article; it seems to be initialised correctly (both values are 0). I have found this article about loopback capture:

http://msdn.microsoft.com/en-gb/library/windows/desktop/dd316551(v=vs.85).aspx

I'm not exactly sure what the implication of this statement is for the current method used:
A pull-mode capture client does not receive any events when a stream is initialized with event-driven buffering and is loopback-enabled. To work around this, initialize a render stream in event-driven mode. Each time the client receives an event for the render stream, it must signal the capture client to run the capture thread that reads the next set of samples from the capture endpoint buffer.
Looking elsewhere on the web, others don't seem to think event-driven loopback capture is possible...

http://blogs.msdn.com/b/matthew_van_eerde/archive/2008/12/16/sample-wasapi-loopback-capture-record-what-you-hear.aspx
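
For reference, the usual NAudio route for loopback is the WasapiLoopbackCapture class, which raises DataAvailable as buffers arrive; whether it uses event-driven or polling capture internally depends on the NAudio version. A minimal sketch:

    using NAudio.CoreAudioApi;
    using NAudio.Wave;

    var capture = new WasapiLoopbackCapture();
    capture.DataAvailable += (s, e) =>
    {
        // e.Buffer holds e.BytesRecorded bytes of whatever the device is rendering,
        // in capture.WaveFormat (usually IEEE float); aggregate the samples here
    };
    capture.RecordingStopped += (s, e) => capture.Dispose();
    capture.StartRecording();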

Ollie

New Post: Playing Audio File Intermittently With WaveStream

Hello,

Using two DirectSoundOut objects, I am trying to repeatedly trigger a short (100 ms) wave file clip alongside a sine wave stream. I attempted to use a System.Timers.Timer and pass a double interval argument to trigger file playback. This is all happening via a UI slider control. All attempts have wreaked much havoc in terms of threading conflicts. Would it be better to dump the samples from the file into a buffer and play from that? It only needs to play once, then pause approximately 0.5 seconds, play again, etc.

Thanks for your consideration.

New Post: streaming mp3 and fading

Thanks for the reply. The problem is that the progress bar doesn't change (it stays frozen) and it writes that message ("Buffer getting full, taking a break") in an infinite loop. It seems the buffer can't empty. But if I add bufferedWaveProvider.ClearBuffer(); after Thread.Sleep(500); it works OK. I can't understand the reason for that.

P.S. It works fine for about 10 seconds, but then this problem occurs.

New Post: streaming mp3 and fading

Are you sure you are playing the buffer?
The principle of that buffering system is quite simple:
on one end you have the web stream, and on the other end there is your output device.
What you are doing is just reading MP3 data from the web stream, decoding the MP3 data to PCM and writing it into a buffer. So the buffer is used by your buffering mechanism to store data in, and it is also used by your output device, such as DirectSoundOut, to read data from and send the raw data from the buffer to your speakers.

If you don't play the stream, you will keep buffering data into the buffer, but you won't remove any data from it -> "Buffer getting full".
If you call bufferedWaveProvider.ClearBuffer() you remove ALL buffered data from your buffer, so it won't be full anymore. But that should not be the goal.
The buffer should be drained by your output device. Your output device reads data and removes the data it has read. So if it removes the data, you won't have to call ClearBuffer(). The 500 ms sleep is built in for the case where the buffer fills up faster than the output device requests and removes data from it. By calling Thread.Sleep(500) you pause the buffering mechanism for 500 ms, and your output device gets the chance to remove enough data. If not, Sleep(500) is called again until the buffer is only filled to about 75%.
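
Put together, the pipeline from the Mp3StreamingDemo looks roughly like this (a simplified sketch, not the demo verbatim; sourceStream stands for the web response stream, and bufferedWaveProvider, waveOut and stopRequested are assumed to be fields of your class):

    // producer side: run this on a worker thread (e.g. ThreadPool.QueueUserWorkItem)
    void StreamMp3(Stream sourceStream)
    {
        IMp3FrameDecompressor decompressor = null;
        var decodeBuffer = new byte[65536];
        while (!stopRequested)
        {
            // throttle while the buffer is close to full, as in your original snippet
            if (bufferedWaveProvider != null &&
                bufferedWaveProvider.BufferLength - bufferedWaveProvider.BufferedBytes
                    < bufferedWaveProvider.WaveFormat.AverageBytesPerSecond / 4)
            {
                Thread.Sleep(500);
                continue;
            }
            Mp3Frame frame = Mp3Frame.LoadFromStream(sourceStream);   // read one MP3 frame
            if (frame == null) break;                                 // end of stream
            if (decompressor == null)
            {
                // the first frame tells us the format; create the decoder and buffer to match
                var mp3Format = new Mp3WaveFormat(frame.SampleRate,
                    frame.ChannelMode == ChannelMode.Mono ? 1 : 2,
                    frame.FrameLength, frame.BitRate);
                decompressor = new AcmMp3FrameDecompressor(mp3Format);
                bufferedWaveProvider = new BufferedWaveProvider(decompressor.OutputFormat);
            }
            int decoded = decompressor.DecompressFrame(frame, decodeBuffer, 0); // MP3 -> PCM
            bufferedWaveProvider.AddSamples(decodeBuffer, 0, decoded);          // producer end
        }
    }

    // consumer side: the output device drains the same buffer while it plays
    // (start this only once bufferedWaveProvider exists; the demo does it from a timer)
    waveOut = new DirectSoundOut();
    waveOut.Init(bufferedWaveProvider);
    waveOut.Play();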

New Post: Network Chat Multi-Client Program

Can I at least show you the pseudocode for my Receiver() method, so I can explain where I'm at with obtaining the data from clients, filling each client's BufferedWaveProvider, and adding them to the MixingSampleProvider?

New Post: How do I play a stereo stream across 4 channels?

I am trying to play a stereo stream across 4 channels (the right channel to RF and RR, the left channel to LF and LR), but only if the current hardware supports it (a sound card with at least 4 outputs).

How do I accomplish this?

New Post: Unable to play wav file

Hi,

I have a wav file that was generated using the NAudio demo that records from a WASAPI input.

The file plays back fine in Windows Media Player, but when I try to play it using AudioFileReader I get a "NoDriver calling acmFormatSuggest" exception. The wav format is "extensible", if that is important.

Any ideas? I could send you the (short) wav file if you would like.

Thanks,
Chris

New Post: How do I play a stereo stream across 4 channels?

NAudio has a class called MultiplexingWaveProvider for exactly your problem.

Just use ConnectInputToOutput to connect your right channel to RF and RR, and so on. If you call Read on, or play back, the MultiplexingWaveProvider you have created, your channels will be mapped automatically.
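
A minimal sketch (my own example, assuming stereoSource is your existing stereo IWaveProvider and the device exposes four outputs; the mapping below follows the usual FL, FR, BL, BR channel order, but check your driver):

    using NAudio.Wave;

    // one stereo input (input channels: 0 = left, 1 = right), four output channels
    var multiplexer = new MultiplexingWaveProvider(new IWaveProvider[] { stereoSource }, 4);

    multiplexer.ConnectInputToOutput(0, 0); // left  -> front left  (LF)
    multiplexer.ConnectInputToOutput(0, 2); // left  -> rear left   (LR)
    multiplexer.ConnectInputToOutput(1, 1); // right -> front right (RF)
    multiplexer.ConnectInputToOutput(1, 3); // right -> rear right  (RR)

    var output = new DirectSoundOut(); // or another IWavePlayer that sees all 4 channels
    output.Init(multiplexer);
    output.Play();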

New Post: Unable to play wav file

This is because WASAPI uses an annoying type of WaveFormat called WaveFormatExtensible, but underneath it is PCM (or maybe IEEE float). So what I do when I create a WaveFileWriter is simply recreate a new WaveFormat object with the right parameters.
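
Something along these lines (a sketch of mine, not the demo code; capture stands for the WasapiCapture instance, and which of the two formats applies depends on what the device reported):

    // capture.WaveFormat comes back as WaveFormatExtensible; rebuild a plain
    // WaveFormat with the same parameters before creating the WaveFileWriter
    WaveFormat ext = capture.WaveFormat;

    // for PCM data:
    WaveFormat plain = new WaveFormat(ext.SampleRate, ext.BitsPerSample, ext.Channels);
    // or, if the underlying data is IEEE float:
    // WaveFormat plain = WaveFormat.CreateIeeeFloatWaveFormat(ext.SampleRate, ext.Channels);

    var writer = new WaveFileWriter("capture.wav", plain);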

New Post: streaming mp3 and fading

Thanks, filoe. In my case, I read the buffer, decode to PCM and write into a buffer. Then I take any single PCM sample value and show its level on my progress bar (because first of all I need to indicate whether that stream is working right now). But I have a button on my form which gives the opportunity to listen to this stream. So if I press the button, my buffer is emptied by the output device and in my loop too. Is that a bad decision?

New Post: streaming mp3 and fading

I don't know why you are trying to read out the level of a PCM value. This is not necessary. If you can't create a connection to the stream you will get a WebException, and if the stream is not an MP3 stream you will get an exception when you try to decode the frames.

So why?

New Post: streaming mp3 and fading

Well, because my program monitors some radio stations, so I need to know whether music is playing or whether there is silence right now. The connection could be OK, but with silence or just noise for the listeners.

New Post: streaming mp3 and fading

Hmmm, double post?
Don't check the level of a "PCM point value". Work with exceptions to indicate whether the stream works.
"In my case, I read the buffer, decode to PCM and write into a buffer" is not correct.
"In my case, I read the stream, decode to PCM and write into a buffer" is correct.