Download the source code and build it yourself. Or, if you use NuGet, you can get a pre-release build from NuGet.
↧
New Post: How to use NAudio with video file!
I can't find the source code for MediaFoundationReader, or for version 1.7.
I was so upset about this. I tried to understand NAudio all day.
Please help me!
↧
New Post: FFT results
Hello, I am new to NAudio and am playing with the NAudioWpfDemo FFT code and I am trying to understand the FFT results. I am using the demo's SampleAggregator.cs and AudioPlayback.cs with an _fftlength of 16.
What I would like to do is display a bar graph, one for each frequency bin, showing the amplitude of each. But those numbers don't seem to be in line with my expectations.
Can someone explain how I can take these values and display a reasonable graph? Or, are these values even the expected results from a musical song?
Oh, and what does the aggregator.NotificationCount = 882 mean? What is a NotificationCount?
Any insight would be greatly appreciated!
Thank you,
Dan
public AudioPlayback()
{
    _fftLength = 16;
    aggregator = new SampleAggregator(_fftLength);
    aggregator.NotificationCount = 882;
    aggregator.PerformFFT = true;
}
Here's my form's callback method which is being called:
void audioGraph_FftCalculated(object sender, FftEventArgs e)
{
    int qtyBins = audioPlayback.FFTLength / 2; // This equals 8
    for (int j = 0; j < qtyBins; j++)
    {
        // audioPlayback.SampleRate = 44100
        double freq = j * (audioPlayback.SampleRate / audioPlayback.FFTLength);
        double dx = e.Result[j].X;
        double dy = e.Result[j].Y;
        Debug.WriteLine(j + " " + freq / 1000 + "kHz = " + dx + " " + dy);
    }
}
And here's some of the callback's output:
bin freq result.X result.Y
0 0kHz = 0.0895436406135559 0
1 2.756kHz = -0.0815449208021164 -0.072540670633316
2 5.512kHz = 0.0273693520575762 0.0248471423983574
3 8.268kHz = 0.000522108399309218 -0.00102939654607326
4 11.024kHz = -0.00428497046232224 -0.00206405459903181
5 13.78kHz = -0.0028145220130682 0.00431088916957378
6 16.536kHz = 0.00255107693374157 -0.00238887779414654
7 19.292kHz = 0.000449806451797485 1.36718153953552E-05
So, as I understand these results, I have 8 bins, and each bin relates to an (average?) frequency. But what are the X and Y components? I am assuming the X component is the average amplitude (energy?) of the frequency, but I have no idea what the Y represents.
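For what it's worth, the X and Y here are the real and imaginary parts of each complex FFT bin, and a bar height is usually derived from the magnitude. A minimal sketch of that idea (assuming e.Result is the Complex[] the demo's aggregator raises, as in the callback above):

```csharp
// Sketch: turn the complex FFT bins above into bar heights.
// X is the real part and Y the imaginary part of each bin.
void audioGraph_FftCalculated(object sender, FftEventArgs e)
{
    int qtyBins = e.Result.Length / 2; // only the first half is meaningful for real input
    for (int j = 0; j < qtyBins; j++)
    {
        double magnitude = Math.Sqrt(e.Result[j].X * e.Result[j].X +
                                     e.Result[j].Y * e.Result[j].Y);
        double db = 20 * Math.Log10(magnitude + 1e-12); // offset avoids log of zero
        // use magnitude (or db) as the bar height for bin j
    }
}
```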
↧
Commented Unassigned: Problem with WavOut.Eventhandler [16393]
Hi
I have been trying to use the PlaybackStopped event handler on the WaveOut class. I am trying to get a playlist to cycle through audio tracks, so at the end of each track it evaluates the playlist and plays the next valid track. For some reason NAudio crashes when I do this, at the .Init() call. I have had a look at the PlaybackState and it seems to still be Playing when the PlaybackStopped event is or has been raised.
Any suggestions?
If you want my project off me, please give me a shout: moogus@hotmail.co.uk
Darren
Comments: Hi, thanks for the response. I was just using the PlaybackStopped event to start another track. The action being passed in is the playlist initiating the next track. I traced it through the sample code version, and when the PlaybackStopped event fires the state is not Stopped; could this be why? Any advice would be cool. Darren
private static IWavePlayer _waveOutDevice;
private static WaveStream _mainOutputStream;

public NAudioPlayFile()
{
    _waveOutDevice = new WaveOut();
}

public void OnTrackStopped(Action action)
{
    _waveOutDevice.PlaybackStopped += (sender, args) => action();
}

public void InitializeAudioTrack(string source)
{
    _mainOutputStream = new AudioFileReader(source);
}

public PlaybackState PlaybackState
{
    get { return _waveOutDevice.PlaybackState; }
}

public void PlayFile()
{
    _waveOutDevice.Init(_mainOutputStream);
    _waveOutDevice.Play();
}

public void CancelTrack()
{
    _waveOutDevice.Stop();
    _waveOutDevice.Dispose();
}
↧
New Post: Netwok stream of loopback recording
Hi, I am trying to stream the WasapiLoopbackCapture output.
The problem is that WasapiLoopbackCapture writes a WAV file of 32 bits per sample, which the VLC network stream reader cannot read.
waveIn = new WasapiLoopbackCapture();
waveIn.DataAvailable += OnDataAvailable;
waveIn.StartRecording();
void OnDataAvailable(object sender, WaveInEventArgs e)
{
    m_MulticastSender.SendBytes(ToRTPPacket(e.Buffer, waveIn.WaveFormat.BitsPerSample, waveIn.WaveFormat.Channels).ToBytes());
}
↧
New Post: Netwok stream of loopback recording
Convert it to 16 bit before you send it. Is it 32-bit int or 32-bit float you are capturing? The helper methods on BitConverter will help you make the conversion.
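Assuming the capture is 32-bit IEEE float (the format WasapiLoopbackCapture normally delivers), a minimal conversion sketch might look like this; the method name and clamping are illustrative, not part of NAudio:

```csharp
// Sketch: convert a buffer of 32-bit float samples to 16-bit PCM.
// Assumes the buffer contains IEEE float samples in [-1, 1].
byte[] ConvertFloatTo16Bit(byte[] buffer, int bytesRecorded)
{
    byte[] output = new byte[bytesRecorded / 2];
    int outIndex = 0;
    for (int i = 0; i < bytesRecorded; i += 4)
    {
        float sample = BitConverter.ToSingle(buffer, i);
        if (sample > 1.0f) sample = 1.0f;   // clamp to avoid wrap-around
        if (sample < -1.0f) sample = -1.0f;
        short pcm = (short)(sample * short.MaxValue);
        output[outIndex++] = (byte)(pcm & 0xFF);        // little-endian low byte
        output[outIndex++] = (byte)((pcm >> 8) & 0xFF); // high byte
    }
    return output;
}
```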
↧
New Post: How to use NAudio with video file!
Look above at the "Source Code" tab.
↧
New Post: Wav file amplitude at a given time interval
WaveFileReader lets you examine each sample individually. For "amplitude", often you will actually look for the maximum sample value over a short period.
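As a sketch of that idea, here is one way to scan a file in 100 ms windows and take the peak per window; it uses AudioFileReader (which exposes samples as floats) rather than WaveFileReader directly, and the file name is hypothetical:

```csharp
// Sketch: peak amplitude of a WAV file over successive 100 ms windows.
using (var reader = new AudioFileReader("input.wav")) // hypothetical file name
{
    int samplesPerWindow = reader.WaveFormat.SampleRate *
                           reader.WaveFormat.Channels / 10; // 100 ms of samples
    float[] buffer = new float[samplesPerWindow];
    int read;
    while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
    {
        float max = 0;
        for (int i = 0; i < read; i++)
            max = Math.Max(max, Math.Abs(buffer[i]));
        // max is the peak amplitude for this window
    }
}
```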
↧
New Post: Netwok stream of loopback recording
Yes... but how do I convert it?
↧
New Post: Which way is best for sampling both left and right channels?
Quick question...
Which way is best for sampling both left and right channels?
Dan
private ISampleProvider CreateInputStream(string fileName)
{
    fileStream = OpenWavStream(fileName);
    var inputStream = new SampleChannel(fileStream, true);
    var sampleStream = new NotifyingSampleProvider(inputStream);
    // this way?
    sampleStream.Sample += (s, e) => aggregator.Add(e.Left);
    sampleStream.Sample += (s, e) => aggregator.Add(e.Right);
    // OR this way?
    sampleStream.Sample += (s, e) => aggregator.Add(e.Left + e.Right);
    return sampleStream;
}
Thank you!
Dan
↧
New Post: Windows 8 resampling
Hi Mark!
I'm porting an application from Windows Phone. I was able to record PCM with your outstanding library, but I do not know what classes I should use to convert the recorded 44 kHz, 32-bit sound to 18 kHz, 16-bit sound. Do you have a sample that explains how I can do that?
Thanks.
↧
New Post: Windows 8 resampling
please read my article here:
http://www.codeproject.com/Articles/501521/How-to-convert-between-most-audio-formats-in-NET
It explains the various options for bit depth changing and resampling (which you'd probably need to do as two separate steps, although the Media Foundation Resampler might be able to do it in one)
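As a sketch of the one-step option mentioned above, assuming the Media Foundation Resampler is available (it is on Windows 8 desktop) and with hypothetical file names:

```csharp
// Sketch: convert 44.1 kHz 32-bit float to 18 kHz 16-bit PCM in one step
// with the Media Foundation Resampler, then write the result out.
using (var reader = new WaveFileReader("recorded.wav")) // hypothetical input
{
    var outFormat = new WaveFormat(18000, 16, reader.WaveFormat.Channels);
    using (var resampler = new MediaFoundationResampler(reader, outFormat))
    {
        WaveFileWriter.CreateWaveFile("converted.wav", resampler);
    }
}
```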
↧
New Post: Which way is best for sampling both left and right channels?
the first way if you were drawing separate waveforms for left and right
↧
New Post: Using WasapiLoopbackCapture to do a sound visualizer
hello
I'm trying to build a visualizer using NAudio's WASAPI loopback capture feature, but I have a problem using the samples.
NAudio.Wave.WasapiLoopbackCapture wasapi = new NAudio.Wave.WasapiLoopbackCapture();
wasapi.StartRecording();
wasapi.DataAvailable += InputBufferToFileCallback;
Now I use the InputBufferToFileCallback event handler to get e.Buffer and then extract the samples from the byte array:
byte[] buffer = e.Buffer;
int bytesRecorded = e.BytesRecorded;
int bufferIncrement = (int)(wasapi.WaveFormat.BlockAlign / wasapi.WaveFormat.Channels);
int bitsPerSample = wasapi.WaveFormat.BitsPerSample;
for (int index = 0; index < e.BytesRecorded; index += bufferIncrement)
{
    float sample32left = 0;
    float sample32right = 0;
    sample32right = BitConverter.ToSingle(buffer, index);
    sample32left = BitConverter.ToSingle(buffer, index + 4);
    sampleAggregator.Add(sample32left, sample32right);
}
My question is whether the samples for the right and left channels are correct, or whether it is possible to use the WaveChannel32 class to get the left and right samples in relation to WasapiLoopbackCapture. Thanks.
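For reference, one detail worth checking in the loop above: stepping by a full frame (BlockAlign bytes, i.e. all channels) reads each left/right pair exactly once, whereas stepping by one channel's width revisits overlapping bytes. A sketch, assuming 32-bit float stereo and the usual left-then-right channel order:

```csharp
// Sketch: read one left/right float pair per frame.
// BlockAlign is the size of a full frame (all channels), so stepping
// by it visits each frame exactly once.
int blockAlign = wasapi.WaveFormat.BlockAlign; // 8 bytes for 32-bit float stereo
for (int index = 0; index + blockAlign <= e.BytesRecorded; index += blockAlign)
{
    float left = BitConverter.ToSingle(e.Buffer, index);      // channel 0
    float right = BitConverter.ToSingle(e.Buffer, index + 4); // channel 1
    sampleAggregator.Add(left, right);
}
```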
↧
New Post: Which way is best for sampling both left and right channels?
markheath wrote:
the first way if you were drawing separate waveforms for left and right
Thank you. Would the second way be appropriate for "mixing" the two channels if I was drawing one waveform?
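If you do mix down to a single waveform, averaging the two channels rather than summing them keeps the result within [-1, 1]. A one-line sketch against the sampleStream and aggregator from the earlier post:

```csharp
// Sketch: mix left and right into one mono sample before aggregating.
// Averaging keeps the mixed sample in range; a plain sum can clip.
sampleStream.Sample += (s, e) => aggregator.Add((e.Left + e.Right) / 2);
```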
↧
New Post: Increase only low bit section of a song
Hi,
I have a problem on a project. I want to increase only the low bit section of a track; the high bit sections stay the same. I am following and using the NAudio project. I would like the particular portion of code.
↧
New Post: Naudio Playing problem
While playing a file, playback gets stuck when the window is minimized or maximized.
Regards,
Hinshin
↧
New Post: From WaveIn To SampleProvider to WaveOut
Hello,
Could you provide a skeleton (C#) for:
- WaveIn to SampleProvider chain to WaveOut
- FileReader to SampleProvider chain to WaveOut
- WaveIn to SampleProvider chain to FileWriter
Thanks,
Seb
↧
New Post: From WaveIn To SampleProvider to WaveOut
If you want to pipe WaveIn to an output, use BufferedWaveProvider.
For your second example, AudioFileReader does exactly what you are asking. Look at the code to see how it does it.
For a file writer, you could use WaveFileWriter.CreateWaveFile (and CreateWaveFile16, which takes a sample provider).
There are several classes in NAudio that go from WaveProvider to SampleProvider and vice versa. Look at Pcm16ToSampleProvider, SampleToWaveProvider, and SampleToWaveProvider16. For NAudio 1.7 I am planning to include some extension methods on IWaveProvider, ISampleProvider and WaveOut which will simplify switching between the two audio stream representations.
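As a minimal sketch of the first case (WaveIn piped to WaveOut through a BufferedWaveProvider):

```csharp
// Sketch: pipe microphone input straight back out through a buffer.
var waveIn = new WaveIn();
var buffer = new BufferedWaveProvider(waveIn.WaveFormat);
waveIn.DataAvailable += (s, e) => buffer.AddSamples(e.Buffer, 0, e.BytesRecorded);

var waveOut = new WaveOut();
waveOut.Init(buffer);    // BufferedWaveProvider is an IWaveProvider
waveIn.StartRecording();
waveOut.Play();
```

Note there will be some latency between input and output, governed by the buffer sizes of WaveIn and WaveOut.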
↧
New Post: How to get the maximum dB value of a section in an MP3 song?
hi
I use the NAudio project as a reference. I want to know how to get the maximum dB value of a section in an MP3 song. Please help me.
↧