Channel: NAudio
Viewing all 5831 articles

New Post: WaveIn Valid Sample Rates

Ok thanks Mark.

It's not a big deal. Low values (like an 8-bit, 8 kHz sample rate) make the output hiss a little and lower the quality (as expected), but that's the worst that happens. My USB headset microphone (rated at 16-bit, 44100/48000 Hz) produces very good quality output when I use the rated values. There is no perceivable difference when I use a sample rate higher than 48000 Hz with that mic. So I think I'm in good shape even without being able to query the device's sample rate and bit depth.

I'm getting just a bit of delay between speaking and hearing the output. Just a tad, but still noticeable. I set the latency to 51; if I drop it by one to 50, it produces crackling static.
waveOut.DesiredLatency = 51;
Are there any other settings that might help reduce the output delay?

New Post: MidiOut is not working

I want to play MIDI sound out to speaker.

New Post: MidiOut is not working

MIDI doesn't make any sound on its own, it needs to be connected to a synthesizer. What is connected to MIDI out device 0?
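For what it's worth, on most Windows machines device 0 is the built-in Microsoft GS Wavetable Synth, which does produce sound. A minimal sketch using NAudio's MidiOut and MidiMessage classes (assuming device 0 is such a synth) might look like:

```csharp
using System;
using System.Threading;
using NAudio.Midi;

// List the available MIDI output devices first
for (int device = 0; device < MidiOut.NumberOfDevices; device++)
{
    Console.WriteLine("{0}: {1}", device, MidiOut.DeviceInfo(device).ProductName);
}

// Play middle C for one second on device 0
using (var midiOut = new MidiOut(0))
{
    // StartNote(note, velocity, channel): note 60 = middle C
    midiOut.Send(MidiMessage.StartNote(60, 100, 1).RawData);
    Thread.Sleep(1000);
    midiOut.Send(MidiMessage.StopNote(60, 0, 1).RawData);
}
```

If the device list is empty, or device 0 is a hardware MIDI port with nothing attached, you will hear nothing even though the messages are sent successfully.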

New Post: WaveIn Valid Sample Rates

You'd better not use such extremely low latency values. I don't think you can do anything about that little delay; you would have to play the audio back in the exact same millisecond you speak it.

New Post: WaveIn Valid Sample Rates

Thanks filoe. I know .NET isn't the best at timing. The Forms Timer is only accurate to about 55 milliseconds, which seems like a lifetime in CPU instructions. But it is what it is.

New Post: WaveIn Valid Sample Rates

It's not .NET, it's the waveIn and waveOut APIs. They are not particularly low latency. You need to work with WASAPI or ASIO to get to lower latencies.
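As a rough sketch of the WASAPI route (note that WASAPI requires Vista or later, and the latency actually achievable depends on the driver), swapping WaveOut for WasapiOut might look like:

```csharp
using NAudio.CoreAudioApi;
using NAudio.Wave;

// Shared mode with a requested 50 ms latency; exclusive mode can go
// lower but takes over the audio device. 'waveProvider' stands in for
// whatever IWaveProvider you are currently passing to WaveOut.Init.
var wasapiOut = new WasapiOut(AudioClientShareMode.Shared, 50);
wasapiOut.Init(waveProvider);
wasapiOut.Play();
```

The rest of the playback code stays the same, since WasapiOut implements the same IWavePlayer interface as WaveOut.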

New Post: WaveIn Valid Sample Rates

Ok, thanks Mark, good to know. I'll keep that in mind. I want to stick with WaveIn/Out for now because I have XP users, and XP is still 40% of the OS market share -- too big to ignore.

Source code checked in, #e1a677786346

improved the decode code in the ACM demo panel

New Post: Stream WAV File

I am trying to figure out how to stream a WAV file. There are examples of streaming MP3 files, but nothing for a WAV file. I have managed to create a WaveStream and play it back. However, when I try to break it up, the first set of bytes plays but after that I get noise.

In the example below the ConcurrentQueue will simulate what I want to stream:
// Class member variables:
ConcurrentQueue<byte[]> _cq1 = new ConcurrentQueue<byte[]>();
private static WaveFormat _waveFormat = new WaveFormat(16000, 16, 2);
private static WaveOut _waveOut1 = new WaveOut();
private static BufferedWaveProvider _waveProvider1 =
             new BufferedWaveProvider(_waveFormat);

// Class constructor
public MainWindow()
{
    _waveOut1.Init(_waveProvider1);
    _waveOut1.Play();
}

// These methods are executed on a separate thread.
WaveStream convertedStream = GetAudio("MyFile.Wav");
AddAudio(convertedStream);


// This method is executed on a separate thread.
PlayAudio();


 

        private WaveStream GetAudio(string wavFile)
        {
            WaveFileReader reader = new WaveFileReader(wavFile);
            WaveStream convertedStream =
                new WaveFormatConversionStream(_waveFormat, reader);

            return convertedStream;
        }

        private void AddAudio(WaveStream convertedStream)
        {
            int bufferSize = (int)convertedStream.Length / _audioSamples;

            byte[] bytes = new byte[bufferSize];
            int count = bufferSize;
            convertedStream.Position = 0;
            int offset = 0;
            for (; ; )
            {
                count = convertedStream.Read(bytes, offset, bufferSize);
                _cq1.Enqueue(bytes);
                offset += bufferSize;
                if (count == 0)
                {
                    break;
                }
            }
         }

        private void PlayAudio()
        {
            byte[] buffer;
            for (; ; )
            {
                if (_cq1.TryDequeue(out buffer))
                {
                    _waveProvider1.AddSamples(buffer, 0, buffer.Length);
                }

                Thread.Sleep(_sleepMilliSeconds);
            }
        }

New Post: Stream WAV File

I'm not sure what you are trying to do here. Why not just play directly from a WaveFileReader? If you are receiving WAV over the network, then just stick it into the BufferedWaveProvider (and increase the buffer size or block when it gets full). The concurrent queue seems unnecessary.
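A minimal version of the "play directly" approach Mark describes (the file name is illustrative) could be:

```csharp
using System.Threading;
using NAudio.Wave;

using (var reader = new WaveFileReader("MyFile.wav"))
using (var waveOut = new WaveOut())
{
    waveOut.Init(reader);
    waveOut.Play();
    while (waveOut.PlaybackState == PlaybackState.Playing)
    {
        Thread.Sleep(100);  // poll until playback finishes
    }
}
```

For the network case, the received byte chunks would go straight into BufferedWaveProvider.AddSamples from whatever callback delivers them, with no intermediate queue.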

New Post: Stream WAV File

I am trying to simulate my network traffic before I go off and code that part. I should be able to read bytes from a queue and add them to my WaveProvider continuously.

New Post: Stream WAV File

I figured it out. The problem was in the AddAudio method. If I use this method it works:
        private void AddAudio(WaveStream convertedStream)
        {
            int sampleRate = convertedStream.WaveFormat.SampleRate;

            int bytesRead = 0;
            while (convertedStream.Position < convertedStream.Length)
            {
                byte[] bytes = new byte[sampleRate];
                bytesRead = convertedStream.Read(bytes, 0, sampleRate);
                _cq1.Enqueue(bytes);
            }
        }

New Comment on "WAV"

This worked for me to play a WAV file:
using NAudio.Wave;
.....
var soundFile = "Something.wav";
using (var wfr = new WaveFileReader(soundFile))
using (WaveChannel32 wc = new WaveChannel32(wfr) { PadWithZeroes = false })
using (var audioOutput = new DirectSoundOut())
{
    audioOutput.Init(wc);
    audioOutput.Play();
    while (audioOutput.PlaybackState != PlaybackState.Stopped)
    {
        Thread.Sleep(20);
    }
    audioOutput.Stop();
}
In this case, the PlaybackStopped event was raised, but I guess it's not good to count on that. This worked when running as a service on Windows 7.

New Post: How to use StereoToMonoProvider

I am trying to get StereoToMonoProvider16 to work. So far I have this working:
private WaveFormat _waveFormat = new WaveFormat(44100, 16, 2);

_waveProvider1 = new BufferedWaveProvider(_waveFormat);
_waveProvider1.BufferDuration = new TimeSpan(_bufferHours, _bufferMinutes, _bufferSeconds);
_waveProvider1.DiscardOnBufferOverflow = true;
_StereoToMono1 = new StereoToMonoProvider16(_waveProvider1);

_waveOut1.Init(_StereoToMono1);
_waveOut1.Play();

_StereoToMono1.LeftVolume = 0;
_StereoToMono1.RightVolume = 1;
I am streaming audio into the BufferedWaveProvider. The WAV files I am streaming are 16-bit PCM at 44.1 kHz, 2 channels.

So when I set _StereoToMono1.LeftVolume to 0 and RightVolume to 1, the sound does not shift to the right side.

Thanks

New Post: How to use StereoToMonoProvider

that's because it's mono. Mono is the same in both speakers. You want to go mono to stereo instead.

New Post: How to use StereoToMonoProvider

So I switched to the MonoToStereoProvider class and now get a "Source must be Mono" exception?

New Post: How to use StereoToMonoProvider

Yes, MonoToStereo takes mono in and gives stereo out. You want stereo in and stereo out, but with the channels swapped around a bit, from the sounds of it. You'd need to make a custom provider to do that.
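A hedged sketch of such a custom provider (the class name and volume properties are my own invention, not part of NAudio; it assumes 16-bit stereo PCM, as used in this thread):

```csharp
using System;
using NAudio.Wave;

// Hypothetical pass-through provider that scales the left and right
// channels independently, e.g. LeftVolume = 0 to mute the left channel.
public class ChannelVolumeProvider16 : IWaveProvider
{
    private readonly IWaveProvider source;
    public float LeftVolume { get; set; }
    public float RightVolume { get; set; }

    public ChannelVolumeProvider16(IWaveProvider source)
    {
        if (source.WaveFormat.Channels != 2 || source.WaveFormat.BitsPerSample != 16)
            throw new ArgumentException("Expects 16-bit stereo input");
        this.source = source;
        LeftVolume = 1.0f;
        RightVolume = 1.0f;
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        int bytesRead = source.Read(buffer, offset, count);
        // Each stereo frame is 4 bytes: [left lo, left hi, right lo, right hi]
        for (int n = 0; n < bytesRead; n += 4)
        {
            short left = BitConverter.ToInt16(buffer, offset + n);
            short right = BitConverter.ToInt16(buffer, offset + n + 2);
            left = (short)(left * LeftVolume);
            right = (short)(right * RightVolume);
            buffer[offset + n] = (byte)(left & 0xFF);
            buffer[offset + n + 1] = (byte)(left >> 8);
            buffer[offset + n + 2] = (byte)(right & 0xFF);
            buffer[offset + n + 3] = (byte)(right >> 8);
        }
        return bytesRead;
    }
}
```

It would slot in where StereoToMonoProvider16 currently sits: wrap the BufferedWaveProvider in it and pass it to _waveOut1.Init.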

New Post: How to use StereoToMonoProvider

Yeah I want to take a stereo input and then only hear the left or the right channel.

Do I need to create a provider to do that?