Unfortunately, you can't use NAudio on Android, as it calls Windows APIs for all of its sound playing, recording and codecs.
↧
New Post: Xamarin + Naudio compile Error.
↧
New Post: AAC encode to stream
That's unfortunate - thanks for the feedback.
↧
↧
New Post: How to create waveform of full audio
I spent some time playing with the project and finally figured out how to create a waveform visualization in WPF using NAudio (it is mostly already done in the WPF demo project if you download this solution).
The problem I'm facing now is that the visualization is created upon actual playback and renders only part of the playing audio at a time. I need to get the waveform of the entire audio without actually playing it. Can anyone help me with this?
Best Regards,
Shujaat
↧
New Post: Mixing an input signal and mp3 file
Hi. I'm trying to mix an input guitar signal with an mp3 file, using the AsioOut class. The guitar signal should play on one channel and the audio file should play on both channels.
Here is my (not yet working) code. How do I make it work?
private static AsioOut asioout;
private static AudioFileReader reader;
private static BufferedWaveProvider bufferedWaveProvider;
private static MixingSampleProvider mixer;

[STAThread]
static void Main(string[] args)
{
    asioout = new AsioOut(1);
    reader = new AudioFileReader(/*SomeMp3File*/);
    bufferedWaveProvider = new BufferedWaveProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 2));
    mixer = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(44100, 2));
    // the mp3 file and the (to-be-filled) guitar buffer both feed the mixer
    mixer.AddMixerInput(reader.ToSampleProvider());
    mixer.AddMixerInput(bufferedWaveProvider);
    asioout.AudioAvailable += AsioOutFtAudioAvailable;
    asioout.InitRecordAndPlayback(new SampleToWaveProvider(mixer), 2, 0);
    asioout.Play();
    while (true)
    {
    }
}

private static void AsioOutFtAudioAvailable(object sender, AsioAudioAvailableEventArgs e)
{
    // copies the guitar input (channel 1) straight into output channel 1
    byte[] bytes = new byte[e.SamplesPerBuffer * 4];
    Marshal.Copy(e.InputBuffers[1], bytes, 0, e.SamplesPerBuffer * 4);
    Marshal.Copy(bytes, 0, e.OutputBuffers[1], e.SamplesPerBuffer * 4);
    e.WrittenToOutputBuffers = true;
}
When e.WrittenToOutputBuffers is true, I hear the guitar signal, but the audio file is not played. When e.WrittenToOutputBuffers is false, only the audio file is played. How do I make it work?
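For reference, here is one possible variation of the handler above (a sketch only, reusing the fields and usings from the code in the question, and assuming the ASIO driver delivers 32-bit float samples, which the 4-bytes-per-sample copy above also assumes): push the guitar input into bufferedWaveProvider so the MixingSampleProvider combines it with the file, instead of writing the input straight to the output buffers.

private static void AsioOutFtAudioAvailable(object sender, AsioAudioAvailableEventArgs e)
{
    const int bytesPerSample = 4;
    var mono = new byte[e.SamplesPerBuffer * bytesPerSample];
    Marshal.Copy(e.InputBuffers[1], mono, 0, mono.Length);

    // Interleave into the stereo format bufferedWaveProvider was created with:
    // guitar on the left channel only, right channel left silent.
    var stereo = new byte[mono.Length * 2];
    for (int n = 0; n < e.SamplesPerBuffer; n++)
    {
        Buffer.BlockCopy(mono, n * bytesPerSample, stereo, n * 2 * bytesPerSample, bytesPerSample);
    }
    bufferedWaveProvider.AddSamples(stereo, 0, stereo.Length);

    // Leave WrittenToOutputBuffers false so NAudio fills the outputs from the
    // mixer (file on both channels plus guitar on one).
    e.WrittenToOutputBuffers = false;
}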
↧
New Post: How to get what is playing from speakers
I've hooked into this, but the DataAvailable event seems to fire constantly with a buffer full of zeros and BytesRecorded = 19200.
What's going on here?
↧
↧
New Post: How to get what is playing from speakers
OK, I just tried it with a sample application and recognized that LoopbackCapture always outputs 32-bit float samples. Also, when specifying the device in the class constructor it stopped working.
Anyway, here's my working WinForms sample code. Just copy it into a VB WinForms app, add a reference to NAudio and place a WaveformPainter on the form.
Dim WI As IWaveIn = Nothing

Private Sub FormMain_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    WI = New WasapiLoopbackCapture()
    AddHandler WI.DataAvailable, AddressOf DataArriving
    WI.StartRecording()
End Sub

Private Sub FormMain_FormClosing(ByVal sender As Object, ByVal e As System.Windows.Forms.FormClosingEventArgs) Handles Me.FormClosing
    If WI IsNot Nothing Then
        WI.StopRecording()
        WI.Dispose()
    End If
End Sub

Private Sub DataArriving(ByVal sender As Object, ByVal e As WaveInEventArgs)
    ' 32-bit float stereo: 8 bytes per frame (4 bytes left + 4 bytes right)
    For index As Integer = 0 To e.BytesRecorded - 1 Step 8
        Dim sampleleft As Single = BitConverter.ToSingle(e.Buffer, index)
        Dim sampleright As Single = BitConverter.ToSingle(e.Buffer, index + 4)
        WaveformPainter1.AddLeftRight(sampleleft, sampleright)
    Next
End Sub
The code should display the audio waveform of your Windows mixer.
FF
↧
New Post: How to create waveform of full audio
In WinForms, NAudio provides a WaveViewer control that displays a WaveStream.
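For example (a minimal sketch, assuming a WinForms form with a WaveViewer from the NAudio.Gui namespace dropped onto it as waveViewer1; the file path is illustrative):

using NAudio.Wave;

// point the control at any WaveStream and it renders the waveform
var reader = new WaveFileReader(@"C:\audio\example.wav");
waveViewer1.WaveStream = reader;
waveViewer1.SamplesPerPixel = 1024; // rough zoom level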
↧
New Post: How to create waveform of full audio
Thanks a bunch Freefall. I was able to use WaveStream.ToSampleProvider() with a Read() loop to compute the waveform points and then use my custom WPF control (with a Polygon in it) to draw the entire audio waveform in the control.
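For anyone following along, here is a rough sketch of that kind of peak scan (the method and parameter names are illustrative, not from the post; the resulting list would then feed the Polygon's points):

using System;
using System.Collections.Generic;
using NAudio.Wave;

static List<float> ComputePeaks(string fileName, int pixelWidth)
{
    var peaks = new List<float>();
    using (var reader = new AudioFileReader(fileName))
    {
        // AudioFileReader exposes IEEE float samples, 4 bytes each
        long totalSamples = reader.Length / 4;
        int samplesPerPixel = (int)Math.Max(1, totalSamples / pixelWidth);
        var buffer = new float[samplesPerPixel];
        var provider = reader.ToSampleProvider();
        int read;
        while ((read = provider.Read(buffer, 0, buffer.Length)) > 0)
        {
            float max = 0f;
            for (int n = 0; n < read; n++)
            {
                max = Math.Max(max, Math.Abs(buffer[n]));
            }
            peaks.Add(max); // one peak per horizontal pixel
        }
    }
    return peaks;
}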
↧
New Post: Scrolling Waveform WPF C#
Hi All,
I'm a hobbyist coder, learning what I can as I trundle along.
I've been trying to code a music player for some time now and keep hitting this impasse and restarting the project out of frustration.
I want to display a zoomed-in scrolling waveform as an audio file plays. I have tried a few options, but only one seems like the right way to do it, and I can't get it coded. In my mind's eye, I need to get the portion of the samples that would be displayed on the screen and draw lines to show the peaks. I have tried this in a DispatcherTimer, grabbing a number of samples based on the current position in the MediaFoundationReader and the size of the container.
This does do something, but the audio drops out every time the timer fires and the image doesn't draw completely. I think it has something to do with the fact that I am reading in the whole file every time.
Any help would be very much appreciated.
Here is the timer code.
string TrackFileName;
DispatcherTimer timer = new DispatcherTimer();
MediaFoundationReader NewAudioreader;
WaveOut NewWaveOut;
WriteableBitmap FullWaveForm;
WriteableBitmap ZoomWaveForm;

void timer_Tick(object sender, EventArgs e)
{
    ZoomWaveForm.Clear();
    long samplesRequired = (long)(NewAudioreader.Length / FullWaveForm.Width);
    long trackStartPos = NewAudioreader.Position;
    // *** I think this is my problem as I read the whole file into the reader. ***
    using (var reader = new AudioFileReader(TrackFileName))
    {
        reader.Position = trackStartPos;
        var samples = samplesRequired / (reader.WaveFormat.Channels * reader.WaveFormat.BitsPerSample);
        var samplesPerPixel = (int)(samplesRequired / ZoomWaveForm.Width);
        var max = 0.0f;
        var batch = (int)samplesPerPixel;
        var mid = ZoomWaveForm.Height / 2;
        var yScale = ZoomWaveForm.Height / 2;
        float[] buffer = new float[samplesPerPixel];
        int read;
        var xPos = 0;
        while ((read = reader.Read(buffer, 0, batch)) == batch)
        {
            for (int n = 0; n < read; n = n + 2)
            {
                max = Math.Max(Math.Abs(buffer[n]), max);
            }
            ZoomWaveForm.DrawLineAa(xPos, (int)(mid + (max * yScale)), xPos, (int)(mid - (max * yScale)), Colors.Orange, 1);
            max = 0;
            xPos++;
            if (reader.Position >= (NewAudioreader.Position) + samplesRequired)
            {
                break;
            }
        }
    }
    // WPF <Image> container
    imgZoomWaveForm.Source = ZoomWaveForm;
}
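One possible direction (sketched only, not a drop-in fix, and the class and member names below are illustrative): instead of re-opening the file on every tick, wrap the sample provider that is actually playing so peaks accumulate as the audio is read, and let the DispatcherTimer just dequeue and draw whatever has arrived.

using System;
using System.Collections.Concurrent;
using NAudio.Wave;

class PeakTapSampleProvider : ISampleProvider
{
    private readonly ISampleProvider source;
    private readonly int samplesPerPeak;
    private float max;
    private int count;

    // the UI timer dequeues these and draws one vertical line per value
    public ConcurrentQueue<float> Peaks { get; } = new ConcurrentQueue<float>();

    public PeakTapSampleProvider(ISampleProvider source, int samplesPerPeak)
    {
        this.source = source;
        this.samplesPerPeak = samplesPerPeak;
    }

    public WaveFormat WaveFormat => source.WaveFormat;

    public int Read(float[] buffer, int offset, int sampleCount)
    {
        int read = source.Read(buffer, offset, sampleCount);
        for (int n = 0; n < read; n++)
        {
            max = Math.Max(max, Math.Abs(buffer[offset + n]));
            if (++count >= samplesPerPeak)
            {
                Peaks.Enqueue(max);
                max = 0;
                count = 0;
            }
        }
        return read;
    }
}

The tap would sit between the reader and the output device (for example, wrapping NewAudioreader.ToSampleProvider() and passing the wrapper, via SampleToWaveProvider, to NewWaveOut.Init), so no second reader is needed.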
↧
↧
New Post: WaveOutEvent MUTE and UNMUTE
Hi,
I have created a WaveOutEvent player. I want to mute and unmute a particular player. I set Volume = (float)0.0, but it shows the error "WaveOutEvent.Volume is obsolete". Please help me solve this.
Thanks in advance,
Athi Varathan
↧
New Post: Waveoutevent FadeinFadeout
Hi,
I have developed a WaveOutEvent player. How do I implement fade-in and fade-out? If anyone has an idea, please share it with me.
Thanks in advance,
AthiVarathan
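Not an answer from this thread, just a hedged sketch: NAudio includes a FadeInOutSampleProvider (in NAudio.Wave.SampleProviders) that can sit between the reader and the WaveOutEvent. The file path and durations below are illustrative.

using NAudio.Wave;
using NAudio.Wave.SampleProviders;

var reader = new AudioFileReader(@"C:\music\song.mp3");
var fade = new FadeInOutSampleProvider(reader, initiallySilent: true);
var player = new WaveOutEvent();
player.Init(fade);
player.Play();
fade.BeginFadeIn(2000);   // fade in over 2 seconds
// ... later, shortly before stopping:
fade.BeginFadeOut(2000);  // fade out over 2 seconds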
↧
Created Unassigned: WaveIn and BufferedWaveProvider ! [16488]
Hi,
I wrote a proxy that must manage a microphone (with WaveIn) and Wav/PCM players (with BufferedWaveProvider).
Everything works fine (I can manage up to 7 wav streams simultaneously), except that when you physically unplug the speakers, my class, based on BufferedWaveProvider, remains silent until you restart the computer.
Is this normal behavior?
Excuse my broken English.
Best regards
RGO
↧
New Post: WaveOutEvent MUTE and UNMUTE
You can still use it. In fact, I've taken the Obsolete attribute off in the latest version of the code anyway.
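For example, a small sketch on top of that (the player field and method name are illustrative): remember the current level, set Volume to 0 to mute, and restore it to unmute.

private WaveOutEvent player;   // created and initialized elsewhere
private float savedVolume = 1.0f;
private bool muted;

private void ToggleMute()
{
    if (!muted)
    {
        savedVolume = player.Volume; // remember the current level
        player.Volume = 0.0f;        // mute
    }
    else
    {
        player.Volume = savedVolume; // unmute
    }
    muted = !muted;
}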
↧
↧
New Post: StreamLatency of Audio Device is Zero
Hi all,
I've created a big project using NAudio to handle voice output. I created a WasapiOut instance for playing my audio data to the device and set a latency value of 20 ms for reading my buffered voice data. That all works fine.
Then, quite by accident, I noticed a CPU load of over 25%. I downloaded the NAudio source code and added it to my project, and saw that the latency value I pass to the WasapiOut class is overwritten in the Init method with the StreamLatency value of the audioClient.
But the StreamLatency in my case is 0, so there is an endless loop querying the voice buffer for new data to play out. So I changed the line from
latencyMilliseconds = (int)(audioClient.StreamLatency / 10000);
to
if (audioClient.StreamLatency > 0)
    latencyMilliseconds = (int)(audioClient.StreamLatency / 10000);
and everything works fine. But I would like to know why this happens and whether my approach is the correct way to resolve the issue. Have you ever seen a StreamLatency of 0 with your devices? (It's a common USB headset.)
Best regards
↧
New Post: StreamLatency of Audio Device is Zero
Sorry, you can close it. I didn't notice that this has already been changed on GitHub.
Thanks.
↧
Commented Unassigned: WaveIn and BufferedWaveProvider ! [16488]
Hi,
I wrote a proxy that must manage a microphone (with WaveIn) and Wav/PCM players (with BufferedWaveProvider).
Everything works fine (I can manage up to 7 wav streams simultaneously), except that when you physically unplug the speakers, my class, based on BufferedWaveProvider, remains silent until you restart the computer.
Is this normal behavior?
Excuse my broken English.
Best regards
RGO
Comments: Well, are you getting any RecordingStopped or PlaybackStopped events? You might need to recreate the waveout devices.
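Along the lines of that suggestion, a hypothetical sketch (the field names are illustrative, not from the issue): watch PlaybackStopped and rebuild the output device when playback stops with an error, re-attaching the existing BufferedWaveProvider.

using NAudio.Wave;

private WaveOutEvent player;
private BufferedWaveProvider bufferedWaveProvider; // the existing buffer being fed elsewhere

private void CreatePlayer()
{
    player = new WaveOutEvent();
    player.PlaybackStopped += (s, e) =>
    {
        if (e.Exception != null)
        {
            // the device went away (e.g. speakers unplugged): dispose and rebuild
            player.Dispose();
            CreatePlayer();
        }
    };
    player.Init(bufferedWaveProvider);
    player.Play();
}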
↧
Commented Unassigned: DirectSoundOut fails if WaveFormat is WaveFormatExtraData [16487]
If the WaveFormat of the wave provider used as the parameter for Init is of type WaveFormatExtraData, the call within InitializeDirectSound to GCHandle.Alloc fails with an ArgumentException complaining that the "Object contains non-primitive or non-blittable data."
This is due to the extraData byte array field. Although it is of a fixed size, Alloc is still unable to pin the array because the array type itself is non-primitive and so cannot be pinned.
public TestPlay(Stream input)
{
    var waveStream = new WaveFileReader(input);
    var player = new DirectSoundOut(DirectSoundOut.DSDEVID_DefaultPlayback);
    player.Init(waveStream);
    player.Play();
    // Waiting for play to stop removed for brevity.
}
Comments: There appears to have been some work done on this. https://naudio.codeplex.com/SourceControl/network/forks/JosephEoff/ExtraDataFix/contribution/7886
↧
↧
New Post: Weird slowing down of audio, when i use Mp3StreamingDemo
Sorry for my bad English.
When I use NAudioDemo -> Mp3StreamingDemo and want to play some music from
TEXT
for example, I get weird audio slowing down. What is the cause of this?
(I've tried using the DMO frame decompressor and I got a "System.NullReferenceException" in NAudio.dll.)
About 95% of audio files from the same website play successfully (TEXT, for example). But with the other 5% I get 50% speed if I use the Acm frame decompressor.
↧
New Post: Weird slowing down of audio, when i use Mp3StreamingDemo
OK, I found the problem, but still can't solve it. frame has a wrong SampleRate property for some audio files.
TEXT
This song has a 44100 Hz sample rate, but frame.SampleRate is 32000 Hz. How can I solve this problem?
Also, the DMO decompressor started working when I manually set the sampleRate argument of the Mp3WaveFormat constructor to the song's true sample rate:
private static IMp3FrameDecompressor CreateFrameDecompressor(Mp3Frame frame)
{
    WaveFormat waveFormat = new Mp3WaveFormat(frame.SampleRate, frame.ChannelMode == ChannelMode.Mono ? 1 : 2,
        frame.FrameLength, frame.BitRate);
    return new DmoMp3FrameDecompressor(waveFormat);
}
↧
New Post: Weird slowing down of audio, when i use Mp3StreamingDemo
I think this is caused by junk (album art?) at the start of MP3 files getting mis-recognised as valid MP3 frames. One way to work around this is to keep reading frames until you've had a certain number in a row with the same format, and ignoring any before that.
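A rough sketch of that workaround (not code from the demo; it assumes a seekable stream, such as a buffered copy of the first part of the download, and the helper name and threshold are illustrative): keep loading frames until several consecutive ones agree on sample rate and channel mode, then rewind to the start of that run.

using System.IO;
using NAudio.Wave;

static Mp3Frame FindFirstReliableFrame(Stream stream, int requiredInARow = 5)
{
    Mp3Frame candidate = null;
    long candidatePosition = 0;
    int matches = 0;
    while (true)
    {
        long framePosition = stream.Position;
        var frame = Mp3Frame.LoadFromStream(stream);
        if (frame == null)
        {
            return null; // end of stream without finding a stable run
        }
        if (candidate != null &&
            frame.SampleRate == candidate.SampleRate &&
            frame.ChannelMode == candidate.ChannelMode)
        {
            matches++;
            if (matches >= requiredInARow)
            {
                stream.Position = candidatePosition; // rewind to the run's first frame
                return candidate;
            }
        }
        else
        {
            candidate = frame;
            candidatePosition = framePosition;
            matches = 1;
        }
    }
}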
↧