Implement in the Pause method: if playback is not running when Pause is called, call Play first, let the thread sleep for some milliseconds, and then pause.
Created Unassigned: pause before play [16483]
Edited Unassigned: pause before play [16483]
Implement in the Pause method: if playback is not running when Pause is called, call Play first, let the thread sleep for some milliseconds, and then pause. This applies to DirectSound.
Created Unassigned: MixingSampleProvider+Waveout does not work [16484]
Hello,
I am using the MixingSampleProvider for multiple inputs.
```
BufferedWaveProvider dummy = new BufferedWaveProvider(new WaveFormat(48000, 16, 1));
MixingSampleProvider ss = new MixingSampleProvider(WaveFormat.CreateIeeeFloatWaveFormat(48000, 1));
ss.AddMixerInput(dummy);
WaveOut waveOut = new WaveOut();
waveOut.Init(ss);
waveOut.Play();
```
An undefined error occurs during the waveOutOpen interop call.
I think the IEEE float WaveFormat is the reason; the same code works fine without the MixingSampleProvider:
```
waveOut.Init(dummy);
waveOut.Play();
```
But when I use DirectSound instead of WaveOut, both code samples work fine.
Why?
New Post: NAudio WaveOutEvent Player
Hi Team,
I have developed a WaveOutEvent player that implements "FadeInOut" and "Volume meter" options. Both work fine individually, but they cannot work at the same time; only one works at a time.
Example:
FadeInOut --> Player.Init(new SampleToWaveProvider(fadeInOut));
Volume meter --> Player.Init(new SampleToWaveProvider(postVolumeMeter));
Please help me make both work in a single player.
Thanks
Athivarathan.S
New Post: Waveoutevent player Fade In/Out
Thank you for your reply; I got the solution. But there is a small problem: FadeInOut works fine, but the volume meter option does not. Only one works at a time.
Example:
Player.Init(new SampleToWaveProvider(fadeInOut)); // FadeInOut works
Player.Init(new SampleToWaveProvider(postVolumeMeter)); // volume meter works
How can both work in a single player? I want to use FadeInOut and the volume meter. Please help.
Thank you.
Commented Unassigned: MixingSampleProvider+Waveout does not work [16484]
Comments: UPDATE: When I change the sample rate to 32000, it works fine. But I need 48000 as the sample rate. WaveOut.Init() does not accept 48000 with the IEEE float format.
New Post: Waveoutevent player Fade In/Out
You need to build a signal chain like this:
```
Dim Reader As New Mp3FileReader("")
Dim FadeInOut = New FadeInOutSampleProvider(Reader.ToSampleProvider)
Dim PostVolume = New VolumeSampleProvider(FadeInOut)
Dim WO As New WaveOut()
WO.Init(PostVolume)
WO.Play()
```
Greetz.
New Post: NAudio WaveOutEvent Player
Stop creating new threads with the same question; I already answered this in your last thread.
Spamming is NOT the way to solve things.
New Post: WaveOut Volume (Network-Chat)
Use VolumeSampleProvider. If you need to push the volume above 100%, you should use a compressor effect.
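As a minimal illustration of that suggestion (the source here is a hypothetical AudioFileReader for brevity; in a network chat you would wrap your BufferedWaveProvider's sample provider instead):
```
// Sketch only: attenuate a signal to 50% with VolumeSampleProvider.
var reader = new AudioFileReader("input.wav");   // hypothetical source file
var volume = new VolumeSampleProvider(reader)
{
    Volume = 0.5f  // 1.0f = 100%; values above 1.0f amplify and may clip
};
var output = new WaveOutEvent();
output.Init(new SampleToWaveProvider(volume));
output.Play();
```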
New Post: Waveoutevent player Fade In/Out
Thank you, sir. But we are using a WAV file in a C# project, so I can't use VolumeSampleProvider.
Here is my code:
```
file = new AudioFileReader(fileName);
fadeInOut = new FadeInOutSampleProvider(file);
var waveChannel = new SampleChannel(file, true);
var postVolumeMeter = new MeteringSampleProvider(waveChannel);
// or: var postVolumeMeter = new MeteringSampleProvider(fadeInOut); // here I can't add a VolumeSampleProvider
postVolumeMeter.StreamVolume += OnPostVolumeMeter_Stack;
Player.Init(new SampleToWaveProvider(postVolumeMeter));
Player.Play();
```
I want to use MeteringSampleProvider and VolumeSampleProvider together. Please help me to do this.
Thank you.
New Post: Waveoutevent player Fade In/Out
It seems you have no clue what "signal chain" means... Anyway, change:
```
var waveChannel = new SampleChannel(file, true);
```
to:
```
var waveChannel = new SampleChannel(fadeInOut, true);
```
and it should work. By the way, "signal chain" means that all the classes are connected, each referring to the one before it; the data is then pulled through all of them when the audio device requests a new block.
Greetz.
PS: Please mind correct presentation and spelling in the future, or nobody will want to help you.
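Putting the whole thread together, a complete chain with fade, volume, and metering might look like this (the file path, fade duration, and handler body are illustrative, not from the original posts):
```
// Illustrative chain: reader -> fade -> volume -> meter -> output device.
var file = new AudioFileReader("song.wav");                 // illustrative path
var fadeInOut = new FadeInOutSampleProvider(file);
var volume = new VolumeSampleProvider(fadeInOut);           // controllable volume
var postVolumeMeter = new MeteringSampleProvider(volume);   // reports peak levels
postVolumeMeter.StreamVolume += (s, e) => { /* update the meter UI here */ };
var player = new WaveOutEvent();
player.Init(new SampleToWaveProvider(postVolumeMeter));
player.Play();
fadeInOut.BeginFadeIn(2000); // 2-second fade-in
```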
New Post: Bind NAudio classes.
Hello,
is it allowed to copy some classes from NAudio into my own project?
Commented Unassigned: MixingSampleProvider+Waveout does not work [16484]
Comments: Hello, something is wrong with the WaveFormat:
```
public static WaveFormat CreateIeeeFloatWaveFormat(int sampleRate, int channels)
{
    WaveFormat wf = new WaveFormat();
    wf.waveFormatTag = WaveFormatEncoding.IeeeFloat;
    wf.channels = (short)channels;
    wf.bitsPerSample = 32;
    wf.sampleRate = sampleRate;
    wf.blockAlign = (short)(4 * channels);
    wf.averageBytesPerSecond = sampleRate * wf.blockAlign;
    wf.extraSize = 0;
    return wf;
}
```
The winmm waveOutOpen API returns 1. As an experiment, I changed `wf.waveFormatTag = WaveFormatEncoding.IeeeFloat;` to `wf.waveFormatTag = WaveFormatEncoding.Pcm;`, and then no error is returned. So what is going on?
Commented Unassigned: MixingSampleProvider+Waveout does not work [16484]
Comments: New update: The waveOutOpen API does not accept a sample rate of exactly 48000. For example, 47999 and smaller work, and 48001 and bigger work. Is this determined by my hardware?
New Post: write wasapicapture and wasapiloopbackcapture to the same file
I'm trying to write the input from a WasapiCapture and a WasapiLoopbackCapture to a single wave file, similar to http://stackoverflow.com/questions/19676932/naudio-recording-multiple-line-in. I don't want to write to separate files and mix them later; I would rather mix before writing to one file. I'm having trouble wrapping my head around the more complex method described in http://stackoverflow.com/a/19679279. Can somebody give a more detailed example?
New Post: WaveFileWriter leaving 0.0325s of dead audio at beginning
I've written the following code to stream from SHOUTcast into an app I'm building with the NAudio library. In every saved wave file there is about 0.0325 seconds of dead silence at the beginning when writing to file with WaveFileWriter.CreateWaveFile. Does anyone know why this might be happening? Is it because I'm reading from the memory stream at incomplete intervals? How can I correctly grab a streaming file in chunks and save it to files without gaps?
The stream needs to be truncated like this because it is a web radio stream that runs indefinitely. I'm chopping it into temp files to be played in Unity3D (which is beside the point in the scope of this question).
```
public void StreamMP3FromUrl()
{
    var response = WebRequest.Create(url).GetResponse();
    using (var stream = response.GetResponseStream())
    {
        byte[] buffer = new byte[65536]; // 64KB chunks
        int read;
        Debug.Log("Buffering");
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            var pos = ms.Position;
            ms.Position = ms.Length;
            ms.Write(buffer, 0, read);
            ms.Position = pos;
            msLength = (int)ms.Length;
            if (ms.Length > 32768.0f * 10)
            {
                CreateWaveFile(ms);
                ms = new MemoryStream();
            }
        }
        Debug.Log("No data");
        // ms.Position = 0;
    }
    Debug.Log("StreamMP3FromUrl ended");
}

int soundIndex = 0;

private void CreateWaveFile(Stream stream)
{
    using (Mp3FileReader reader = new Mp3FileReader(stream))
    {
        WaveFileWriter.CreateWaveFile("temp" + soundIndex.ToString() + ".wav", reader);
        soundIndex += 1;
    }
}
```
New Post: write wasapicapture and wasapiloopbackcapture to the same file
It's fairly tricky to write. Basically, when audio arrives from each source, I'd put it into a BufferedWaveProvider. If both BufferedWaveProviders hold more than a set amount of data, mix that amount of audio into the mixed file; otherwise wait for more to arrive. Then, when recording ends, make sure you mix in the last leftovers. I'm afraid I don't have a code sample to share at the moment.
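Since no code sample was shared in the thread, here is a rough, untested sketch of that buffering approach. It assumes, for simplicity, that both devices happen to deliver the same 16-bit PCM format; in practice WASAPI devices usually deliver 32-bit IEEE float and may differ in sample rate, so you would convert and resample first. The file name and block size are arbitrary choices, and this would live inside a method:
```
// Untested sketch of the buffer-both-then-mix approach described above.
var mic = new WasapiCapture();
var speakers = new WasapiLoopbackCapture();
var micBuffer = new BufferedWaveProvider(mic.WaveFormat);
var spkBuffer = new BufferedWaveProvider(speakers.WaveFormat);
var writer = new WaveFileWriter("mixed.wav", mic.WaveFormat); // arbitrary name
var mixLock = new object();
const int block = 4096; // mix in fixed-size blocks (arbitrary size)

void TryMix()
{
    lock (mixLock) // DataAvailable fires on different threads
    {
        // Only mix once BOTH buffers hold a full block; otherwise wait for more.
        while (micBuffer.BufferedBytes >= block && spkBuffer.BufferedBytes >= block)
        {
            var a = new byte[block];
            var b = new byte[block];
            micBuffer.Read(a, 0, block);
            spkBuffer.Read(b, 0, block);
            var mixed = new byte[block];
            for (int i = 0; i < block; i += 2) // 16-bit samples
            {
                int sum = BitConverter.ToInt16(a, i) + BitConverter.ToInt16(b, i);
                // Clamp to avoid wrap-around distortion.
                sum = Math.Max(short.MinValue, Math.Min(short.MaxValue, sum));
                BitConverter.GetBytes((short)sum).CopyTo(mixed, i);
            }
            writer.Write(mixed, 0, block);
        }
    }
}

mic.DataAvailable += (s, e) => { micBuffer.AddSamples(e.Buffer, 0, e.BytesRecorded); TryMix(); };
speakers.DataAvailable += (s, e) => { spkBuffer.AddSamples(e.Buffer, 0, e.BytesRecorded); TryMix(); };
mic.StartRecording();
speakers.StartRecording();
// On stop: StopRecording on both, mix any leftovers (padding the emptier
// buffer with silence), then dispose the writer.
```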
New Post: WaveFileWriter leaving 0.0325s of dead audio at beginning
Hello,
as far as I know, a radio stream consists not only of audio chunks (in fact, often MP3 frames) but also of info chunks. It seems you are writing those to the wave file as well.
If not, and this is real audio, I can help you a little. Assuming a wave format of 44.1 kHz, 2 channels, 16 bits, you can calculate the size of your dead audio in bytes:
Dead audio size in bytes = 0.0325 s * 44100 1/s * 2 * 2 bytes = 5733 bytes
Hope this helps. Greetz,
Freefall
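That arithmetic can be checked directly (44100 samples/s × 2 channels × 2 bytes = 176,400 bytes per second of audio):
```
// Verify the dead-audio size estimate above.
double seconds = 0.0325;
int bytesPerSecond = 44100 * 2 * 2;                        // 176,400 B/s
int deadBytes = (int)Math.Round(seconds * bytesPerSecond);
Console.WriteLine(deadBytes);                              // 5733
```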
New Post: NAudio on .NET Framework 4.6, any known issues?
Hi Mark, et al,
I'm migrating my app up to .NET Framework 4.5.2 (or maybe .NET 4.6). NAudio compiles fine for 4.5.2 and the app runs as expected in early testing. I'm just wondering if anyone has compiled NAudio to target .NET 4.6 and whether there are any known issues with NAudio libraries on that version.
These are the two dlls I use.
NAudio.dll
NAudio.WindowsMediaFormat.dll