Channel: NAudio

New Post: Using lame.exe on WaveIn

Do you mean in two steps?
1) record wave to disk and close
2) send the finished wave file to lame

I have done that and it works fine.
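For reference, here is a minimal sketch of that two-step approach. The file names, the LAME flags, and having lame.exe on the PATH are all assumptions for illustration, not the original poster's code:

using System.Diagnostics;
using NAudio.Wave;

// Step 1: record from the default input device to a WAV file
var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(44100, 16, 2) };
var writer = new WaveFileWriter("recorded.wav", waveIn.WaveFormat);

waveIn.DataAvailable += (s, e) => writer.Write(e.Buffer, 0, e.BytesRecorded);
waveIn.RecordingStopped += (s, e) =>
{
    writer.Dispose(); // finalise the WAV header before handing the file to lame

    // Step 2: send the finished WAV file to lame (-V2 = VBR quality preset)
    var lame = Process.Start("lame.exe", "-V2 \"recorded.wav\" \"recorded.mp3\"");
    lame.WaitForExit();
};

waveIn.StartRecording();
// ... record for a while, then call waveIn.StopRecording();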

New Post: Using lame.exe on WaveIn

Then something is wrong with the lame command-line parameters.

New Post: Stream player question

No, I'm afraid NAudio does not provide this capability.

New Post: Using lame.exe on WaveIn

Okay, I'll play around with them. I started this using your EncodeToMp3UsingStdIn method from your Pluralsight course with the same results. I think I'll back up to there and try again. Thanks.
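For anyone comparing notes, a rough sketch of what piping raw WaveIn data into lame.exe over stdin can look like. This is not the course's exact code, and the raw-input flags (-r, -s, --bitwidth, --signed, --little-endian) should be double-checked against your LAME build and capture format:

using System.Diagnostics;
using NAudio.Wave;

// Tell lame to expect raw, signed, little-endian 16-bit PCM on stdin
// (-r = raw input, -s = sample rate in kHz, "-" = read from stdin)
var lame = new Process
{
    StartInfo = new ProcessStartInfo
    {
        FileName = "lame.exe",
        Arguments = "-r -s 44.1 --bitwidth 16 --signed --little-endian - \"output.mp3\"",
        UseShellExecute = false,
        RedirectStandardInput = true
    }
};
lame.Start();

var waveIn = new WaveInEvent { WaveFormat = new WaveFormat(44100, 16, 2) };
waveIn.DataAvailable += (s, e) =>
    lame.StandardInput.BaseStream.Write(e.Buffer, 0, e.BytesRecorded);
waveIn.RecordingStopped += (s, e) =>
{
    lame.StandardInput.BaseStream.Close(); // lets lame flush and finish the MP3
    lame.WaitForExit();
};
waveIn.StartRecording();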

New Post: Stream player question

Is there an NAudio API that provides the ability to play web streams (i.e. http://...)?
I'm currently using WaveOut and IWavePlayer, which works great, but it won't accept an HTTP URI.

P.S. I don't need the stream information that MagratG asks about. I just want to be able to play the stream.
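NAudio itself has no built-in HTTP streaming engine, but depending on the OS and the stream's format, one thing worth trying is MediaFoundationReader, which accepts URLs as well as file paths on Windows 7 and later. A minimal sketch, with the URL obviously a placeholder:

using NAudio.Wave;

// MediaFoundationReader can open a URL directly for formats that
// Media Foundation understands (e.g. MP3, AAC).
using (var reader = new MediaFoundationReader("http://example.com/stream.mp3"))
using (var waveOut = new WaveOut())
{
    waveOut.Init(reader);
    waveOut.Play();
    while (waveOut.PlaybackState == PlaybackState.Playing)
        System.Threading.Thread.Sleep(500);
}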

New Post: MixingSampleProvider to Output

Hello! I'm new to this subject, so please forgive me if my question sounds silly. I want to mix multiple inputs using MixingSampleProvider and have the result played. How can I connect the MixingSampleProvider to an output like WaveOut or DirectSoundOut?

And is it possible to add/replace/delete the mixer inputs while playing?
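A minimal sketch of one way to wire this up (the file name is a placeholder; all inputs to the mixer must share the same sample rate and channel count):

using NAudio.Wave;
using NAudio.Wave.SampleProviders;

// The mixer's format must be IEEE float; inputs must match it.
var mixer = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 2))
{
    ReadFully = true // keep the output running even when no inputs are playing
};

var waveOut = new WaveOut();
waveOut.Init(mixer);
waveOut.Play();

// Inputs can be added and removed while playback is running:
var input1 = new AudioFileReader("input1.wav");
mixer.AddMixerInput((ISampleProvider)input1);
// ... later:
mixer.RemoveMixerInput(input1);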

New Post: BufferedWaveProvider Buffer Full

Hi all,

I am receiving some audio data via HTTP and now want to play it. But I keep getting an exception saying that the buffer is full. I know this is normally caused by not playing out the buffer, but I am playing it...
The data-receiving side should be correct (I checked that several times), so the problem has to be somewhere in the playback part.

This is my code:
using System.Collections.Specialized;
using System.Net;
using NAudio.Wave;

class Program
{
    static BufferedWaveProvider bufferedWaveProvider;
    static WaveOut player;
    static WebClient client = new WebClient();

    static void Main(string[] args)
    {
        bufferedWaveProvider = new BufferedWaveProvider(new WaveFormat(48000, 2));
        player = new WaveOut();
        player.Init(bufferedWaveProvider);
        player.Play();

        // Note: the WebClient must not be disposed inside the loop,
        // or the second UploadValues call would throw.
        while (true)
        {
            byte[] response =
                client.UploadValues("http://localhost:8080/test", new NameValueCollection());

            bufferedWaveProvider.AddSamples(response, 0, response.Length);

            System.Threading.Thread.Sleep(1);
        }
    }
}
To keep this readable, I left out some code that checks whether the received data is already stored in the wave provider; only when something new arrives is it written to the wave provider. That means the exception is not caused by the endless loop running too fast for the playback.

Does anyone know what is going wrong?
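For anyone hitting the same exception: BufferedWaveProvider's capacity and overflow behaviour are adjustable, which can help diagnose or work around this. A sketch of the relevant properties:

using System;
using NAudio.Wave;

var provider = new BufferedWaveProvider(new WaveFormat(48000, 2))
{
    BufferDuration = TimeSpan.FromSeconds(20),  // default capacity is 5 seconds
    DiscardOnBufferOverflow = true              // drop excess data instead of throwing
};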

New Post: Weird issue with WaveProvider16

I'm using WasapiOut with a class derived from WaveProvider16, since I'm getting streamed data in from the network and playing it out.

The problem is, when my Read routine gets called, what is 'supposed' to be a short[] buffer is actually a byte[] buffer. I was trying to use Array.Copy to slam some 16-bit PCM data into the buffer, but it throws an exception due to the incompatible types.

Any idea how to get an actual short[] buffer passed to my Read routine?

New Post: BufferedWaveProvider Buffer Full

I have found the problem now.

The problem was the WaveFormat used to create the buffered wave provider. I had only checked that the sample rate and channel count were the same, because I did not know that there is a difference between PCM and IEEE float. I changed that and everything is fine :)
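For anyone else hitting this: if the incoming data is 32-bit IEEE float, the provider's format has to say so. A sketch, assuming 48 kHz stereo:

using NAudio.Wave;

// Matches a 48 kHz stereo IEEE-float source
var provider = new BufferedWaveProvider(
    WaveFormat.CreateIeeeFloatWaveFormat(48000, 2));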

New Post: BufferedWaveProvider Buffer Full

Great, glad you found the problem.

New Post: Preparing audio for using HTML 5 audio element

Hi!

I currently have two WPF applications.
The first one reads data from my ASIO soundcard, does some adjustments (volume, mixing, ...) and then sends a 16-bit 44.1 kHz PCM stream over the network when a client is connected.
The second application is that client. It receives the stream over TCP and just plays the incoming data.

This works pretty well and it's very fast.

Now I want to replace my second WPF application with an HTML page. What is the best way to do this?

As I don't use any protocol, do I need to wrap my PCM stream into something?
I know that different browsers need different audio formats to be able to play them.

It is important to keep the almost-zero delay and the good audio quality I have right now.

For now this solution only has to work on a LAN, so the network shouldn't be a bottleneck.

Kind regards,
Stefan
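One common trick for feeding raw PCM to an HTML5 audio element is to serve it as a never-ending WAV: write a RIFF header once with a bogus, huge data length, then stream the raw samples after it. Latency and browser support vary, so treat this as a sketch rather than a recommendation:

using System.IO;
using System.Text;

// Writes a 44-byte canonical WAV header with an (effectively) infinite
// data length, so a browser will start playing the PCM that follows.
static void WriteStreamingWavHeader(Stream s, int sampleRate, short channels, short bits)
{
    var w = new BinaryWriter(s, Encoding.ASCII);
    int blockAlign = channels * (bits / 8);
    w.Write(Encoding.ASCII.GetBytes("RIFF"));
    w.Write(int.MaxValue);                       // bogus RIFF chunk size
    w.Write(Encoding.ASCII.GetBytes("WAVEfmt "));
    w.Write(16);                                 // fmt chunk size
    w.Write((short)1);                           // 1 = PCM
    w.Write(channels);
    w.Write(sampleRate);
    w.Write(sampleRate * blockAlign);            // average bytes per second
    w.Write((short)blockAlign);
    w.Write(bits);
    w.Write(Encoding.ASCII.GetBytes("data"));
    w.Write(int.MaxValue);                       // bogus data chunk size
    w.Flush();
    // ...now keep writing raw 16-bit PCM frames to the same stream
}

On the HTML side, an audio element pointed at that URL should at least attempt to play it in browsers that support WAV, though browser-side buffering adds noticeable delay, so a WebSocket plus Web Audio API approach may serve the low-latency goal better.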

New Post: Setting WaveOut buffersize directly to a number of samples

Hey Mark, thanks, this is a good idea. I'll try that :)

New Post: Click Winforms controlbox pauses sound out

New Post: Input Driven Resampling Wasapi Loopback

Hi,

I am recording some audio data using WASAPI loopback, and now I want to do input-driven resampling (resampling before writing to file) in order to change the sample rate and bit depth.
I know there is an example on the internet using the ACM resampler, but as WASAPI loopback produces 32-bit IEEE float samples I cannot use ACM (as far as I know).
Does anyone know how to do this using the WDL resampler, or have a code example for me? I really have no idea about this and I can't find any examples...

The aim is to reduce the amount of data. I think for this purpose it would be even better to do a real-time MP3 conversion, but I am not sure whether this is possible at all. If you have any ideas or code snippets for that, it would be very nice :)

I would be very grateful for any ideas helping me to implement either one.
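A sketch of the WDL route, assuming the loopback capture is the usual 32-bit IEEE float and the target is 16-bit 44.1 kHz PCM; buffer sizes, error handling, and disposal are simplified:

using NAudio.Wave;
using NAudio.Wave.SampleProviders;

var capture = new WasapiLoopbackCapture();

// Buffer the float samples coming from loopback, then resample and
// convert down to 16-bit on the way into the file.
var buffered = new BufferedWaveProvider(capture.WaveFormat)
{
    ReadFully = false // return only real data, not padded silence
};
var resampled = new WdlResamplingSampleProvider(buffered.ToSampleProvider(), 44100);
var sixteenBit = new SampleToWaveProvider16(resampled);
var writer = new WaveFileWriter("loopback.wav", sixteenBit.WaveFormat);
var chunk = new byte[sixteenBit.WaveFormat.AverageBytesPerSecond];

capture.DataAvailable += (s, e) =>
{
    buffered.AddSamples(e.Buffer, 0, e.BytesRecorded);
    // Drain whatever the resampler can produce from the buffered input
    int read;
    while ((read = sixteenBit.Read(chunk, 0, chunk.Length)) > 0)
        writer.Write(chunk, 0, read);
};
capture.StartRecording();
// ... on RecordingStopped, dispose the writer to finalise the file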

New Post: OnStopRecording error,

It seems that when the software freezes, or when I stop it at a breakpoint, I get this message. It also seems to come from Windows itself, because it is in the language of the Windows installation.

New Post: Input Driven Resampling Wasapi Loopback

Now I am trying to do it with the MediaFoundationResampler (although I am not sure whether this is the "right" one :D). I am trying this:
byte[] converted = new byte[38400];
BufferedWaveProvider rs = new BufferedWaveProvider(loopbackRecorder.WaveFormat);
MediaFoundationResampler mfr = new MediaFoundationResampler(rs, new WaveFormat(44100, 32, 2));

public void waveIn_DataAvailable(object sender, WaveInEventArgs e)
{
    // ...
    rs.AddSamples(e.Buffer, 0, e.BytesRecorded);

    // Read in terms of the OUTPUT buffer's capacity, not rs.BufferedBytes:
    // the resampled output is a different size from the input, and asking
    // for more bytes than 'converted' can hold overflows the array.
    int count = mfr.Read(converted, 0, converted.Length);
    //...
}
When I am debugging this, it seems to run correctly the first time data is available: something is written into the converted array and the BufferedWaveProvider is cleared because it was read.
But on subsequent calls, the samples are stored in the BufferedWaveProvider but are never read out, and nothing is written into my converted array. Eventually this causes a crash because too much piles up in the buffer, more bytes than my converted array is long.

I am not sure whether this is a good approach, and I have no clue why it only seems to run the first time - the code is obviously the same each time :P

New Post: Weird issue with WaveProvider16

The reason you are seeing this is that NAudio uses a trick under the covers to prevent unnecessary copying of memory. This is the WaveBuffer class, which uses unions to make a byte[] look like it's a short[]. Unfortunately, Array.Copy uses reflection under the hood and gets confused. I think you'll find Buffer.BlockCopy will do the trick instead, but remember to pass the count parameter as a number of bytes.
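Illustrating that suggestion, a sketch of a WaveProvider16.Read override; networkBytes is a hypothetical byte[] standing in for wherever the streamed PCM lives:

// Inside a class derived from WaveProvider16
public override int Read(short[] buffer, int offset, int sampleCount)
{
    // 'buffer' is really a WaveBuffer union, so Buffer.BlockCopy works
    // where Array.Copy throws. All BlockCopy counts are in BYTES.
    int bytesToCopy = Math.Min(sampleCount * 2, networkBytes.Length);
    Buffer.BlockCopy(networkBytes, 0, buffer, offset * 2, bytesToCopy);
    return bytesToCopy / 2; // Read returns a number of SAMPLES
}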